Dataset columns and value-length ranges (character lengths for string columns, element counts for sequence columns):

| Column | Type | Min | Max |
|:----------------|:---------|------:|------:|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
f6b479dbbf9061a5bfd8e0db67b4f6cd50d55478
# Multiturn Multimodal

We want to generate synthetic data that is able to understand the position of, and relationship between, multiple images and multiple audio clips; examples are shown below.

All notebooks are at https://github.com/mesolitica/malaysian-dataset/tree/master/chatbot/multiturn-multimodal

## multi-images

- [synthetic-multi-images-relationship.jsonl](synthetic-multi-images-relationship.jsonl), 100000 rows, 109MB.
- Images at https://huggingface.co/datasets/mesolitica/translated-LLaVA-Pretrain/tree/main

### Example data

```python
{'filename': ['00370/003705168.jpg', '00611/006111738.jpg'], 'filename_description': ['paco man edp gift set', "raspberry lemonade is a must in new york's famous cocktail scene"], 'instruction': 'What is related between picture 1 and picture 2', 'answer': "There is no direct relation between Picture 1 (Paco Rabanne 1 Million Edp Gift Set) and Picture 2 (Raspberry Lemonade being a must in New York's famous cocktail scene). Both are unrelated to each other. The first picture is an image of a perfume gift set, while the second picture is a depiction of a popular cocktail in New York City.", 'instruction_ms': 'Apakah yang berkaitan antara gambar 1 dan gambar 2', 'answer_ms': 'Tiada hubungan langsung antara Gambar 1 (Paco Rabanne 1 Million Edp Gift Set) dan Gambar 2 (Raspberry Lemonade menjadi must dalam adegan koktel terkenal di New York). Kedua-duanya tidak berkaitan antara satu sama lain. Gambar pertama ialah imej set hadiah minyak wangi, manakala gambar kedua ialah gambaran koktel popular di New York City.'}
```

## multi-images multi-audio

- [synthetic-multi-images-multi-audio-relationship.jsonl](synthetic-multi-images-multi-audio-relationship.jsonl), 59400 rows, 96.6 MB.
- Images at https://huggingface.co/datasets/mesolitica/translated-LLaVA-Pretrain/tree/main
- Audio from https://huggingface.co/datasets/mesolitica/malaysian-youtube-audio-instructions/tree/main

### Example data

```python
{'filename': ['output-audio/3-2648-47.mp3', '00180/001805101.jpg'], 'filename_description': ['Saya mahu muka mereka terlihat beras, anda tahu apa yang saya maksudkan. Dan sanitizer. Dan kemudian ini adalah earphone. Sama-sama kalau airpod saya, anda tahu, hilang bateri. Saya tidak pasti jika saya patut membawa tripod saya kerana saya mungkin. Adakah saya akan melakukan TikTok di kafe? Saya tidak tahu tetapi tidak menyakiti untuk membawanya. Maksud saya, ia tidak begitu keras. Saya perlu membawa krim tangan saya. Dan kemudian bumbu. Dan lip balm. Dan kemudian kita siap untuk pergi.', 'a water wheel with moss growing on the wheels metal print by randall white'], 'instruction': 'What is related between audio 1 and picture 1', 'answer': "The audio and picture do not have a direct relation to each other. The audio is about preparing items for an outing, including sanitizer, earphones, a tripod, and various other personal items. The picture is a print of a water wheel with moss growing on it by Randall White. There is no connection between the audio's content and the picture's subject matter.", 'instruction_ms': 'Apakah yang berkaitan antara audio 1 dan gambar 1', 'answer_ms': 'Audio dan gambar tidak mempunyai hubungan langsung antara satu sama lain. Audio adalah mengenai penyediaan item untuk keluar, termasuk pembersih, fon telinga, tripod dan pelbagai barangan peribadi lain. Gambar itu ialah cetakan roda air dengan lumut yang tumbuh di atasnya oleh Randall White. Tiada kaitan antara kandungan audio dan subjek gambar.'}
```

## multi-audio

- [synthetic-multi-images-multi-audio-relationship.jsonl](synthetic-multi-images-multi-audio-relationship.jsonl), 25100 rows, 65.1 MB.
- Audio from https://huggingface.co/datasets/mesolitica/malaysian-youtube-audio-instructions/tree/main

```python
{'filename': ['output-audio/3-2080-38.mp3', 'output-audio/0-2823-0.mp3'], 'filename_description': ['Terima kasih Menteri. Saya jemput soalan tambahan yang kedua. Bagan Serai. Terima kasih Tuan Speaker. Berapakah jumlah kemalangan yang menyebabkan kematian disebabkan oleh pengaruh handphone, penggunaan handphone semasa mandu. Kerana guna handphone mandu ini dia macam mabuk lebih Tuan Speaker. Dan dia hilang orientasi. Dia tak tahu di mana traffic light, dia tak tahu dia di mana berada dan tiba-tiba dah sampai. Jadi apa kerajaan nak buat untuk menurunkan tabiat buruk menggunakan handphone semasa mandu.', 'dalam video tu saya dah kitamkan kening lah sebab benda tu kita mencuba so at least kita dah mencuba kita kan nak mencuba kan masa ni lah mencuba kan janganlah pula usia macam aku dah 50 pun nak cuba kenapa masa buat lagu raya cover tu tak boleh hijau sebab dia nak image ketupat macam Aina Abdul juga dia ketupat kita bawa image rambut tu warna hijau ketupat juga kan tapi dah habis raya after this memang nak reveal jugalah kan habis ni memang saya akan kekalkan image yang very very formal je lah'], 'instruction': 'What is related between audio 1 and audio 2', 'answer': 'Audio 1 and Audio 2 are unrelated as they discuss different topics. In Audio 1, the speaker is discussing the issue of using handphones while driving and its contribution to accidents. In Audio 2, the speaker is talking about making a cover song for Raya and the challenges they faced in creating the image for the video.', 'instruction_ms': 'Apakah yang berkaitan antara audio 1 dan audio 2', 'answer_ms': 'Audio 1 dan Audio 2 tidak berkaitan kerana mereka membincangkan topik yang berbeza. Dalam Audio 1, penceramah membincangkan isu menggunakan fon tangan semasa memandu dan sumbangannya kepada kemalangan. Dalam Audio 2, penceramah bercakap tentang membuat lagu penutup untuk Raya dan cabaran yang mereka hadapi dalam mencipta imej untuk video itu.'}
```
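A quick way to peek at one of these files is to download it with `huggingface_hub` and read it line by line. This is a minimal sketch that assumes the JSONL files sit at the root of the `mesolitica/synthetic-multiturn-multimodal` repository and that every row carries the keys shown in the examples above.

```python
import json

from huggingface_hub import hf_hub_download

# Download one of the JSONL files listed above (assumed to live at the repo root).
path = hf_hub_download(
    repo_id="mesolitica/synthetic-multiturn-multimodal",
    repo_type="dataset",
    filename="synthetic-multi-images-relationship.jsonl",
)

# Each line is one JSON object with the filenames, descriptions and the
# paired English/Malay instruction-answer fields shown in the examples.
with open(path, encoding="utf-8") as f:
    for i, line in enumerate(f):
        row = json.loads(line)
        print(row["instruction"], "->", row["answer"][:80])
        if i == 2:  # peek at the first few rows only
            break
```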
mesolitica/synthetic-multiturn-multimodal
[ "language:ms", "language:en", "license:mit", "region:us" ]
2024-01-28T12:14:58+00:00
{"language": ["ms", "en"], "license": "mit"}
2024-02-02T05:54:36+00:00
[]
[ "ms", "en" ]
TAGS #language-Malay (macrolanguage) #language-English #license-mit #region-us
# Multiturn Multimodal We want to generate synthetic data that able to understand position and relationship between multi-images and multi-audio, example as below, All notebooks at URL ## multi-images - URL, 100000 rows, 109MB. - Images at URL ### Example data ## multi-images multi-audio - URL, 59400 rows, 96.6 MB. - Images at URL - Audio from URL ### Example data ## multi-audio - URL, 25100 rows, 65.1 MB. - Audio from URL
[ "# Multiturn Multimodal\n\nWe want to generate synthetic data that able to understand position and relationship between multi-images and multi-audio, example as below,\n\nAll notebooks at URL", "## multi-images\n\n- URL, 100000 rows, 109MB.\n- Images at URL", "### Example data", "## multi-images multi-audio\n\n- URL, 59400 rows, 96.6 MB.\n- Images at URL\n- Audio from URL", "### Example data", "## multi-audio\n\n- URL, 25100 rows, 65.1 MB.\n- Audio from URL" ]
[ "TAGS\n#language-Malay (macrolanguage) #language-English #license-mit #region-us \n", "# Multiturn Multimodal\n\nWe want to generate synthetic data that able to understand position and relationship between multi-images and multi-audio, example as below,\n\nAll notebooks at URL", "## multi-images\n\n- URL, 100000 rows, 109MB.\n- Images at URL", "### Example data", "## multi-images multi-audio\n\n- URL, 59400 rows, 96.6 MB.\n- Images at URL\n- Audio from URL", "### Example data", "## multi-audio\n\n- URL, 25100 rows, 65.1 MB.\n- Audio from URL" ]
618a5d732e619864ba4e7c807ad1a080f4fa7769
# Dataset of Matou Sakura [Alter] (Fate Stay Night [UFOTABLE]) This is the dataset of Matou Sakura [Alter] (Fate Stay Night [UFOTABLE]), containing 60 images and their tags. The core tags of this character are `long_hair, ribbon, hair_ribbon, white_hair, red_eyes, purple_hair, breasts, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 60 | 49.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_alter_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 60 | 39.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_alter_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 120 | 73.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_alter_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 60 | 49.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_alter_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 120 | 88.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_alter_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/matou_sakura_alter_fatestaynightufotable', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, dark_persona, looking_at_viewer, solo, striped, smile, turtleneck, upper_body | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, long_sleeves, vertical-striped_dress, from_side, black_dress, dark_persona | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dark_persona | looking_at_viewer | solo | striped | smile | turtleneck | upper_body | long_sleeves | vertical-striped_dress | from_side | black_dress | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:----------|:--------|:-------------|:-------------|:---------------|:-------------------------|:------------|:--------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | | | X | X | X | X |
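For the packaged variants in the table above, a minimal sketch of downloading the `800` IMG+TXT package and pairing each image with its tags might look like the following; the assumption that tags are stored in a same-stem `.txt` file next to each image (in a flat directory) follows the usual IMG+TXT convention and is not stated explicitly in the card.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the 800px IMG+TXT package listed in the packages table.
zip_file = hf_hub_download(
    repo_id='CyberHarem/matou_sakura_alter_fatestaynightufotable',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract it into a local directory.
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair every image with its tag file (assumed to share the same file stem).
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```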
CyberHarem/matou_sakura_alter_fatestaynightufotable
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-28T12:48:33+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-28T12:54:45+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Matou Sakura [Alter] (Fate Stay Night [UFOTABLE]) ============================================================ This is the dataset of Matou Sakura [Alter] (Fate Stay Night [UFOTABLE]), containing 60 images and their tags. The core tags of this character are 'long\_hair, ribbon, hair\_ribbon, white\_hair, red\_eyes, purple\_hair, breasts, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
4ed5969c08449c078816b00a724f07f9e5f7fb4a
# Dataset of Medea (Fate Stay Night [UFOTABLE]) This is the dataset of Medea (Fate Stay Night [UFOTABLE]), containing 34 images and their tags. The core tags of this character are `blue_hair, pointy_ears, purple_lips`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 34 | 27.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 34 | 21.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 59 | 34.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 34 | 27.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 59 | 43.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/medea_fatestaynightufotable', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, lipstick, choker, hood_up, dress, cape | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | lipstick | choker | hood_up | dress | cape | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:---------|:----------|:--------|:-------| | 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X |
CyberHarem/medea_fatestaynightufotable
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-28T12:54:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-28T12:58:11+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Medea (Fate Stay Night [UFOTABLE]) ============================================= This is the dataset of Medea (Fate Stay Night [UFOTABLE]), containing 34 images and their tags. The core tags of this character are 'blue\_hair, pointy\_ears, purple\_lips', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
0dd9a8988aafbd6eab23f17beff244277926a21c
<div align="center"> # Advanced RVC Inference [![Colab](https://img.shields.io/badge/Colab-Advanced%20RVC%20Inference-blue?style=for-the-badge&logo=googlecolab)](https://colab.research.google.com/github/ArkanDash/Advanced-RVC-Inference/blob/master/Advanced-RVC.ipynb) </div> ### Information Advanced RVC Inference presents itself as a state-of-the-art web UI crafted to streamline rapid and effortless inference. This comprehensive toolset encompasses a model downloader, a voice splitter, and the added efficiency of batch inference. Please support the original RVC. This inference won't be possible to make without it.<br /> [![Original RVC Repository](https://img.shields.io/badge/Github-Original%20RVC%20Repository-blue?style=for-the-badge&logo=github)](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI) #### Features - Support V1 & V2 Model ✅ - Youtube Audio Downloader ✅ - Demucs (Voice Splitter) [Internet required for downloading model] ✅ - Microphone Support ✅ - TTS Support ✅ - Model Downloader ✅ - Batch Inference (Beta) ✅ #### Currently Working - Settings 🛠 ### Installation 1. Install Dependencies <br /> ```bash pip install torch torchvision torchaudio pip install -r requirements.txt ``` 2. Install [ffmpeg](https://ffmpeg.org/) 3. Download [Hubert Model](https://huggingface.co/lj1995/VoiceConversionWebUI/blob/main/hubert_base.pt) 4. [OPTIONAL] To use rmvpe pitch extraction, download this [rvmpe.pt](https://huggingface.co/lj1995/VoiceConversionWebUI/blob/main/rmvpe.pt) ### Run WebUI <br /> For Windows: ```bash Open run.bat ``` For Other: ```bash python infer.py ```
varaslaw/deleteee2
[ "region:us" ]
2024-01-28T12:56:52+00:00
{}
2024-02-03T14:18:27+00:00
[]
[]
TAGS #region-us
<div align="center"> # Advanced RVC Inference ![Colab](URL </div> ### Information Advanced RVC Inference presents itself as a state-of-the-art web UI crafted to streamline rapid and effortless inference. This comprehensive toolset encompasses a model downloader, a voice splitter, and the added efficiency of batch inference. Please support the original RVC. This inference won't be possible to make without it.<br /> ![Original RVC Repository](URL #### Features - Support V1 & V2 Model - Youtube Audio Downloader - Demucs (Voice Splitter) [Internet required for downloading model] - Microphone Support - TTS Support - Model Downloader - Batch Inference (Beta) #### Currently Working - Settings ### Installation 1. Install Dependencies <br /> 2. Install ffmpeg 3. Download Hubert Model 4. [OPTIONAL] To use rmvpe pitch extraction, download this URL ### Run WebUI <br /> For Windows: For Other:
[ "# Advanced RVC Inference\n\n![Colab](URL\n</div>", "### Information\nAdvanced RVC Inference presents itself as a state-of-the-art web UI crafted to streamline rapid and effortless inference. This comprehensive toolset encompasses a model downloader, a voice splitter, and the added efficiency of batch inference.\n\nPlease support the original RVC. This inference won't be possible to make without it.<br />\n![Original RVC Repository](URL", "#### Features\n- Support V1 & V2 Model \n- Youtube Audio Downloader \n- Demucs (Voice Splitter) [Internet required for downloading model] \n- Microphone Support \n- TTS Support \n- Model Downloader \n- Batch Inference (Beta)", "#### Currently Working\n- Settings", "### Installation\n\n1. Install Dependencies <br />\n\n2. Install ffmpeg\n\n3. Download Hubert Model\n\n4. [OPTIONAL] To use rmvpe pitch extraction, download this URL", "### Run WebUI <br />\n\nFor Windows:\n\nFor Other:" ]
[ "TAGS\n#region-us \n", "# Advanced RVC Inference\n\n![Colab](URL\n</div>", "### Information\nAdvanced RVC Inference presents itself as a state-of-the-art web UI crafted to streamline rapid and effortless inference. This comprehensive toolset encompasses a model downloader, a voice splitter, and the added efficiency of batch inference.\n\nPlease support the original RVC. This inference won't be possible to make without it.<br />\n![Original RVC Repository](URL", "#### Features\n- Support V1 & V2 Model \n- Youtube Audio Downloader \n- Demucs (Voice Splitter) [Internet required for downloading model] \n- Microphone Support \n- TTS Support \n- Model Downloader \n- Batch Inference (Beta)", "#### Currently Working\n- Settings", "### Installation\n\n1. Install Dependencies <br />\n\n2. Install ffmpeg\n\n3. Download Hubert Model\n\n4. [OPTIONAL] To use rmvpe pitch extraction, download this URL", "### Run WebUI <br />\n\nFor Windows:\n\nFor Other:" ]
69f26c69f84d7abb6623891634a88443297e9318
# Dataset of Medusa (Fate Stay Night [UFOTABLE]) This is the dataset of Medusa (Fate Stay Night [UFOTABLE]), containing 24 images and their tags. The core tags of this character are `long_hair, purple_hair, very_long_hair, facial_mark, breasts, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 24 | 19.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medusa_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 24 | 15.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medusa_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 45 | 28.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medusa_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 24 | 19.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medusa_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 45 | 35.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medusa_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/medusa_fatestaynightufotable', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, blindfold, cleavage, bare_shoulders, forehead_mark, collar, detached_sleeves, strapless_dress, thighhighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blindfold | cleavage | bare_shoulders | forehead_mark | collar | detached_sleeves | strapless_dress | thighhighs | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:-----------|:-----------------|:----------------|:---------|:-------------------|:------------------|:-------------| | 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
CyberHarem/medusa_fatestaynightufotable
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-28T12:58:21+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-28T13:00:30+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Medusa (Fate Stay Night [UFOTABLE]) ============================================== This is the dataset of Medusa (Fate Stay Night [UFOTABLE]), containing 24 images and their tags. The core tags of this character are 'long\_hair, purple\_hair, very\_long\_hair, facial\_mark, breasts, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
212644f2e2986cba7d787a2505a02091ddb79361
# Dataset of Himuro Kane (Fate Stay Night [UFOTABLE]) This is the dataset of Himuro Kane (Fate Stay Night [UFOTABLE]), containing 11 images and their tags. The core tags of this character are `glasses, long_hair, brown_eyes, brown_hair, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 11 | 5.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 11 | 5.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 22 | 9.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 11 | 5.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 22 | 10.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/himuro_kane_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/himuro_kane_fatestaynightufotable', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, homurahara_academy_school_uniform | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | homurahara_academy_school_uniform | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X |
CyberHarem/himuro_kane_fatestaynightufotable
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-28T13:00:36+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-28T13:22:57+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Himuro Kane (Fate Stay Night [UFOTABLE]) =================================================== This is the dataset of Himuro Kane (Fate Stay Night [UFOTABLE]), containing 11 images and their tags. The core tags of this character are 'glasses, long\_hair, brown\_eyes, brown\_hair, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
709677063cfa55c70e9ca737edf4bb67ca4e6b51
# Dataset of Fujimura Taiga (Fate Stay Night [UFOTABLE]) This is the dataset of Fujimura Taiga (Fate Stay Night [UFOTABLE]), containing 73 images and their tags. The core tags of this character are `brown_hair, short_hair, brown_eyes, earrings`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 73 | 56.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 73 | 45.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 139 | 85.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 73 | 56.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 139 | 102.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/fujimura_taiga_fatestaynightufotable', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, jewelry, solo, smile, anime_coloring, looking_at_viewer | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | solo | smile | anime_coloring | looking_at_viewer | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:--------|:-----------------|:--------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X |
CyberHarem/fujimura_taiga_fatestaynightufotable
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-28T13:00:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-28T13:07:40+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Fujimura Taiga (Fate Stay Night [UFOTABLE]) ====================================================== This is the dataset of Fujimura Taiga (Fate Stay Night [UFOTABLE]), containing 73 images and their tags. The core tags of this character are 'brown\_hair, short\_hair, brown\_eyes, earrings', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
5fcae4dd1e75dfc00ca2f716ca681fbf76fa3294
# Dataset Card for [Alpaca Plus](https://huggingface.co/datasets/ErfanMoosaviMonazzah/alpaca-plus)

Alpaca Plus is an enhanced version of the [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) dataset, which is itself a cleaned version of the [tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca) dataset. This progression reflects an improvement in the quality and usability of the data.

In the process of creating Alpaca Plus, nearly **97% of instructions** were classified into **593 unique instruction types**. This classification provides a more granular understanding of the dataset and enhances its potential for various applications.

## Dataset Details

Apart from `question_wh` and `question_yn`, which store instructions that are WH-questions or yes/no questions respectively, every other type corresponds to a single kind of instruction. For `question_wh` and `question_yn`, the corresponding value in the `instruction_keyword` column gives the exact question word.

Below is a list of the instruction types that contain more than 100 instructions (`unk` represents unclassified instructions):

| Instruction Type | Frequency |
|------------------|-------|
| generate | 4837 |
| create | 3785 |
| question_wh | 3763 |
| describe | 2989 |
| write | 2891 |
| explain | 2111 |
| name | 1982 |
| identify | 1662 |
| unk | 1634 |
| find | 1480 |
| rewrite | 1382 |
| suggest | 1150 |
| list | 1137 |
| classify | 1002 |
| provide | 999 |
| give | 952 |
| summarize | 803 |
| construct | 779 |
| edit | 721 |
| come | 708 |
| design | 696 |
| compare | 654 |
| compose | 583 |
| analyze | 553 |
| make | 531 |
| convert | 480 |
| categorize | 474 |
| calculate | 431 |
| determine | 398 |
| tell | 391 |
| add | 286 |
| develop | 275 |
| question_yn | 267 |
| change | 246 |
| take | 217 |
| select | 216 |
| translate | 206 |
| evaluate | 191 |
| imagine | 188 |
| brainstorm | 184 |
| choose | 173 |
| arrange | 169 |
| predict | 168 |
| rearrange | 168 |
| output | 164 |
| outline | 163 |
| sort | 138 |
| read | 137 |
| replace | 137 |
| reword | 127 |
| formulate | 124 |
| complete | 118 |
| paraphrase | 117 |
| propose | 114 |
| answer | 105 |
| transform | 105 |
| pick | 101 |
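A minimal sketch of working with the extra columns might look like the following; it assumes the `type` column stores the names used in the table above (e.g. `question_wh`) alongside the `instruction_keyword` column described earlier.

```python
from datasets import load_dataset

# Load the train split; the columns are instruction, input, output,
# instruction_keyword, type and text.
ds = load_dataset("ErfanMoosaviMonazzah/alpaca-plus", split="train")

# Keep only WH-question instructions and inspect the exact question word
# stored in instruction_keyword.
wh = ds.filter(lambda row: row["type"] == "question_wh")
print(len(wh), "WH-question instructions")
print(wh[0]["instruction_keyword"], "->", wh[0]["instruction"])
```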
ErfanMoosaviMonazzah/alpaca-plus
[ "task_categories:conversational", "task_categories:text-generation", "size_categories:10K<n<100K", "language:en", "instruction-finetuning", "region:us" ]
2024-01-28T13:21:17+00:00
{"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation"], "pretty_name": "Alpaca Plus", "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "instruction_keyword", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 89735513, "num_examples": 51760}], "download_size": 48797733, "dataset_size": 89735513}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["instruction-finetuning"]}
2024-02-03T09:43:42+00:00
[]
[ "en" ]
TAGS #task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #instruction-finetuning #region-us
Dataset Card for Alpaca Plus ============================ Alpaca Plus is an enhanced version of the yahma/alpaca-cleaned dataset, which is a cleaned version of the tatsu-lab/alpaca dataset. This progression signifies an enhancement in the quality and usability of the data. In the process of creating Alpaca Plus, nearly 97% of instructions were classified into 593 unique instruction types. This classification provides a more granular understanding of the dataset and enhances its potential for various applications. Dataset Details --------------- Apart from 'question\_wh' and 'question\_yn', which store instructions that are either WH-questions or yes/no questions respectively, all other types focus solely on one type of instruction. In case of question\_wh or question\_yn you can use corresponding value of instruction\_keyword columns of the dataset to see the exact word. Below is a list of instruction types that contain more than 100 instructions (unk represent unclassified instructions):
[]
[ "TAGS\n#task_categories-conversational #task_categories-text-generation #size_categories-10K<n<100K #language-English #instruction-finetuning #region-us \n" ]
7739da00ba1fac4acdfaf9887da3eeec6d06ca0b
# Dataset Card for Evaluation run of aihub-app/ZySec-7B-v1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [aihub-app/ZySec-7B-v1](https://huggingface.co/aihub-app/ZySec-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_aihub-app__ZySec-7B-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T13:29:55.767663](https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__ZySec-7B-v1/blob/main/results_2024-01-28T13-29-55.767663.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5992943703774103, "acc_stderr": 0.03331859785777884, "acc_norm": 0.6062149526919669, "acc_norm_stderr": 0.03403027227512744, "mc1": 0.4149326805385557, "mc1_stderr": 0.017248314465805978, "mc2": 0.5649322732323967, "mc2_stderr": 0.016365165663274596 }, "harness|arc:challenge|25": { "acc": 0.5998293515358362, "acc_stderr": 0.014317197787809181, "acc_norm": 0.6348122866894198, "acc_norm_stderr": 0.014070265519268802 }, "harness|hellaswag|10": { "acc": 0.6620195180242979, "acc_stderr": 0.004720551323547126, "acc_norm": 0.8501294562836088, "acc_norm_stderr": 0.003562149890962717 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.042849586397534015, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.042849586397534015 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.618421052631579, "acc_stderr": 0.03953173377749194, "acc_norm": 0.618421052631579, "acc_norm_stderr": 0.03953173377749194 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6377358490566037, "acc_stderr": 0.0295822451283843, "acc_norm": 0.6377358490566037, "acc_norm_stderr": 0.0295822451283843 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.0373362665538351, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.0373362665538351 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4978723404255319, "acc_stderr": 0.03268572658667493, "acc_norm": 0.4978723404255319, "acc_norm_stderr": 0.03268572658667493 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.47586206896551725, "acc_stderr": 0.041618085035015295, "acc_norm": 0.47586206896551725, "acc_norm_stderr": 0.041618085035015295 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3862433862433862, "acc_stderr": 0.025075981767601684, "acc_norm": 0.3862433862433862, "acc_norm_stderr": 0.025075981767601684 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7451612903225806, "acc_stderr": 0.024790118459332208, "acc_norm": 0.7451612903225806, "acc_norm_stderr": 0.024790118459332208 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7121212121212122, "acc_stderr": 0.03225883512300992, "acc_norm": 0.7121212121212122, "acc_norm_stderr": 0.03225883512300992 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8290155440414507, "acc_stderr": 0.02717121368316453, "acc_norm": 0.8290155440414507, "acc_norm_stderr": 0.02717121368316453 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5897435897435898, "acc_stderr": 0.0249393139069408, "acc_norm": 0.5897435897435898, "acc_norm_stderr": 0.0249393139069408 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524572, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524572 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 
0.03684881521389023, "acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7963302752293578, "acc_stderr": 0.017266742087630797, "acc_norm": 0.7963302752293578, "acc_norm_stderr": 0.017266742087630797 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5694444444444444, "acc_stderr": 0.033769221512523345, "acc_norm": 0.5694444444444444, "acc_norm_stderr": 0.033769221512523345 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7088607594936709, "acc_stderr": 0.02957160106575337, "acc_norm": 0.7088607594936709, "acc_norm_stderr": 0.02957160106575337 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.03252113489929188, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.03252113489929188 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6793893129770993, "acc_stderr": 0.04093329229834278, "acc_norm": 0.6793893129770993, "acc_norm_stderr": 0.04093329229834278 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.039418975265163025, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.039418975265163025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6993865030674846, "acc_stderr": 0.03602511318806771, "acc_norm": 0.6993865030674846, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4017857142857143, "acc_stderr": 0.04653333146973646, "acc_norm": 0.4017857142857143, "acc_norm_stderr": 0.04653333146973646 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.014805384478371163, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.014805384478371163 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6647398843930635, "acc_stderr": 0.025416003773165555, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.025416003773165555 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2994413407821229, "acc_stderr": 0.015318257745976708, "acc_norm": 0.2994413407821229, "acc_norm_stderr": 0.015318257745976708 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.630718954248366, "acc_stderr": 0.027634176689602656, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.027634176689602656 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6784565916398714, "acc_stderr": 0.026527724079528872, "acc_norm": 0.6784565916398714, "acc_norm_stderr": 0.026527724079528872 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.654320987654321, "acc_stderr": 0.02646248777700187, "acc_norm": 0.654320987654321, "acc_norm_stderr": 0.02646248777700187 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.46099290780141844, "acc_stderr": 0.029736592526424438, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.029736592526424438 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42242503259452413, "acc_stderr": 0.012615600475734921, "acc_norm": 0.42242503259452413, "acc_norm_stderr": 0.012615600475734921 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6470588235294118, "acc_stderr": 0.0290294228156814, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.0290294228156814 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.630718954248366, "acc_stderr": 0.01952431674486635, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.01952431674486635 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6530612244897959, "acc_stderr": 0.030472526026726492, "acc_norm": 0.6530612244897959, "acc_norm_stderr": 0.030472526026726492 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7910447761194029, "acc_stderr": 0.028748298931728655, "acc_norm": 0.7910447761194029, "acc_norm_stderr": 0.028748298931728655 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.4149326805385557, "mc1_stderr": 0.017248314465805978, "mc2": 0.5649322732323967, "mc2_stderr": 0.016365165663274596 }, "harness|winogrande|5": { "acc": 0.7813733228097869, "acc_stderr": 0.011616198215773229 }, "harness|gsm8k|5": { "acc": 0.23199393479909022, "acc_stderr": 0.01162687317509241 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_aihub-app__ZySec-7B-v1
[ "region:us" ]
2024-01-28T13:32:16+00:00
{"pretty_name": "Evaluation run of aihub-app/ZySec-7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [aihub-app/ZySec-7B-v1](https://huggingface.co/aihub-app/ZySec-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aihub-app__ZySec-7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T13:29:55.767663](https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__ZySec-7B-v1/blob/main/results_2024-01-28T13-29-55.767663.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5992943703774103,\n \"acc_stderr\": 0.03331859785777884,\n \"acc_norm\": 0.6062149526919669,\n \"acc_norm_stderr\": 0.03403027227512744,\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5649322732323967,\n \"mc2_stderr\": 0.016365165663274596\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809181,\n \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.014070265519268802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6620195180242979,\n \"acc_stderr\": 0.004720551323547126,\n \"acc_norm\": 0.8501294562836088,\n \"acc_norm_stderr\": 0.003562149890962717\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 
0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667493,\n \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667493\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.033769221512523345,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.033769221512523345\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371163,\n \"acc_norm\": 0.7803320561941252,\n 
\"acc_norm_stderr\": 0.014805384478371163\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n \"acc_stderr\": 0.015318257745976708,\n \"acc_norm\": 0.2994413407821229,\n \"acc_norm_stderr\": 0.015318257745976708\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602656,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602656\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5649322732323967,\n \"mc2_stderr\": 0.016365165663274596\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23199393479909022,\n \"acc_stderr\": 0.01162687317509241\n }\n}\n```", "repo_url": "https://huggingface.co/aihub-app/ZySec-7B-v1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-29-55.767663.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["**/details_harness|winogrande|5_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T13-29-55.767663.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T13_29_55.767663", "path": ["results_2024-01-28T13-29-55.767663.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T13-29-55.767663.parquet"]}]}]}
2024-01-28T13:32:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of aihub-app/ZySec-7B-v1 Dataset automatically created during the evaluation run of model aihub-app/ZySec-7B-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T13:29:55.767663 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
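The loading instructions above stop at "you can for instance do the following:" because the code block was stripped from this flattened rendering; the snippet preserved in the dataset_summary metadata field above is:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_aihub-app__ZySec-7B-v1",
    "harness_winogrande_5",
    split="train",
)
```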
[ "# Dataset Card for Evaluation run of aihub-app/ZySec-7B-v1\n\n\n\nDataset automatically created during the evaluation run of model aihub-app/ZySec-7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:29:55.767663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of aihub-app/ZySec-7B-v1\n\n\n\nDataset automatically created during the evaluation run of model aihub-app/ZySec-7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:29:55.767663(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6010c881582a11184e68898c2df8a9ded0166982
# Dataset Card for "libri2Mix_test_synth" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/libri2Mix_test_synth
[ "region:us" ]
2024-01-28T13:41:21+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "original", "path": "data/original-*"}, {"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k_12bps", "path": "data/encodec_24k_12bps-*"}, {"split": "encodec_24k_1_5bps", "path": "data/encodec_24k_1_5bps-*"}, {"split": "encodec_24k_24bps", "path": "data/encodec_24k_24bps-*"}, {"split": "encodec_24k_3bps", "path": "data/encodec_24k_3bps-*"}, {"split": "encodec_24k_6bps", "path": "data/encodec_24k_6bps-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}], "splits": [{"name": "original", "num_bytes": 322456512.0, "num_examples": 2000}, {"name": "academicodec_hifi_16k_320d", "num_bytes": 321978580.0, "num_examples": 2000}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 321978580.0, "num_examples": 2000}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 483045460.0, "num_examples": 2000}, {"name": "audiodec_24k_320d", "num_bytes": 484079100.0, "num_examples": 2000}, {"name": "dac_16k", "num_bytes": 322456512.0, "num_examples": 2000}, {"name": "dac_24k", "num_bytes": 483593732.0, "num_examples": 2000}, {"name": "dac_44k", "num_bytes": 888451922.0, "num_examples": 2000}, {"name": "encodec_24k_12bps", "num_bytes": 483593732.0, "num_examples": 2000}, {"name": "encodec_24k_1_5bps", "num_bytes": 483593732.0, "num_examples": 2000}, {"name": "encodec_24k_24bps", "num_bytes": 483593732.0, "num_examples": 2000}, {"name": "encodec_24k_3bps", "num_bytes": 483593732.0, "num_examples": 2000}, {"name": "encodec_24k_6bps", "num_bytes": 483593732.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 322444980.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 322444980.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 322524512.0, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 322524512.0, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 322524512.0, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 322524512.0, "num_examples": 2000}, {"name": "speech_tokenizer_16k", "num_bytes": 322995060.0, "num_examples": 2000}], "download_size": 8277679548, "dataset_size": 8303992126.0}}
2024-01-28T13:48:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for "libri2Mix_test_synth" More Information needed
[ "# Dataset Card for \"libri2Mix_test_synth\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"libri2Mix_test_synth\"\n\nMore Information needed" ]
a4c4e9d059879897897fb17670e6e70484d06d81
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> A small dataset of 42 high-resolution images of cryptocurrency coins with clipped *.txt descriptions. It can be used to extend datasets or for tuning models.
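Since the description above is all the card gives, here is a hedged sketch of how such an image-caption set might be loaded for text-to-image work. Only the repository id and the "n<1K" size come from this record; the `train` split name and the `image`/`text` column names are assumptions, as the card does not document its schema.

```python
# Hedged loading sketch -- repo id is from this record; the "train" split and
# the "image"/"text" column names are assumptions not documented by the card.
from datasets import load_dataset

ds = load_dataset("SpectralDoor/cryptocurrency-coins-hi-res", split="train")
print(ds)  # expected to be a single small split (~42 examples per the description)

sample = ds[0]
sample["image"].show()   # assumed PIL image column
print(sample["text"])    # assumed caption column built from the *.txt descriptions
```

If the actual column names differ, `print(ds.column_names)` after loading will show the real schema before wiring the data into a fine-tuning or dataset-extension pipeline.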
SpectralDoor/cryptocurrency-coins-hi-res
[ "task_categories:text-to-image", "size_categories:n<1K", "language:en", "license:mit", "region:us" ]
2024-01-28T13:42:28+00:00
{"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"]}
2024-01-28T13:53:37+00:00
[]
[ "en" ]
TAGS #task_categories-text-to-image #size_categories-n<1K #language-English #license-mit #region-us
# Dataset Card for Dataset Name A small dataset of 42 high-resolution images of cryptocurrency coins with clipped *.txt descriptions. It can be used to extend datasets or for tuning models.
[ "# Dataset Card for Dataset Name\n\n\n\nA small dataset of 42 high-resolution images of cryptocurrency coins with clipped *.txt descriptions. \nIt can be used to extend datasets or for tuning models." ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #language-English #license-mit #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nA small dataset of 42 high-resolution images of cryptocurrency coins with clipped *.txt descriptions. \nIt can be used to extend datasets or for tuning models." ]
36655778fc1eafd57eb007a62a0286d7621a716b
# Dataset Card for Evaluation run of freecs/ThetaWave-14B-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [freecs/ThetaWave-14B-v0.1](https://huggingface.co/freecs/ThetaWave-14B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T13:45:58.918363](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1/blob/main/results_2024-01-28T13-45-58.918363.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5965032661159082, "acc_stderr": 0.032717164407508395, "acc_norm": 0.6089127016997782, "acc_norm_stderr": 0.033610152499338415, "mc1": 0.25091799265605874, "mc1_stderr": 0.015176985027707696, "mc2": 0.5040944748516392, "mc2_stderr": 0.01650838155954231 }, "harness|arc:challenge|25": { "acc": 0.3703071672354949, "acc_stderr": 0.014111298751674948, "acc_norm": 0.4283276450511945, "acc_norm_stderr": 0.014460496367599013 }, "harness|hellaswag|10": { "acc": 0.3354909380601474, "acc_stderr": 0.004711968379069014, "acc_norm": 0.47092212706632147, "acc_norm_stderr": 0.004981336318033636 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7361111111111112, "acc_stderr": 0.03685651095897532, "acc_norm": 0.7361111111111112, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.037038511930995215, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.037038511930995215 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146267, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146267 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5172413793103449, "acc_stderr": 0.04164188720169375, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.04164188720169375 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3915343915343915, "acc_stderr": 0.025138091388851112, "acc_norm": 0.3915343915343915, "acc_norm_stderr": 0.025138091388851112 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768177, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768177 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6903225806451613, "acc_stderr": 0.026302774983517414, "acc_norm": 0.6903225806451613, "acc_norm_stderr": 0.026302774983517414 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198906, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198906 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8238341968911918, "acc_stderr": 0.027493504244548057, "acc_norm": 0.8238341968911918, "acc_norm_stderr": 0.027493504244548057 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5743589743589743, "acc_stderr": 0.02506909438729653, "acc_norm": 0.5743589743589743, "acc_norm_stderr": 0.02506909438729653 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34074074074074073, "acc_stderr": 0.02889774874113114, "acc_norm": 0.34074074074074073, "acc_norm_stderr": 0.02889774874113114 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059285, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059285 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.304635761589404, "acc_stderr": 
0.03757949922943343, "acc_norm": 0.304635761589404, "acc_norm_stderr": 0.03757949922943343 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8146788990825689, "acc_stderr": 0.01665927970029582, "acc_norm": 0.8146788990825689, "acc_norm_stderr": 0.01665927970029582 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 0.03385177976044811, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.03385177976044811 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.029331162294251735, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.029331162294251735 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.040393149787245605, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.040393149787245605 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.034624199316156234, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.034624199316156234 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690879, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690879 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400407002, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400407002 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7777777777777778, "acc_stderr": 0.014866821664709588, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.014866821664709588 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6473988439306358, "acc_stderr": 0.025722802200895817, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.025722802200895817 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31508379888268156, "acc_stderr": 0.015536850852473642, "acc_norm": 0.31508379888268156, "acc_norm_stderr": 0.015536850852473642 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757485, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757485 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6752411575562701, "acc_stderr": 0.02659678228769704, "acc_norm": 0.6752411575562701, "acc_norm_stderr": 0.02659678228769704 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7067901234567902, "acc_stderr": 0.025329888171900926, "acc_norm": 0.7067901234567902, "acc_norm_stderr": 0.025329888171900926 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236848, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236848 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4348109517601043, "acc_stderr": 0.012661233805616302, "acc_norm": 0.4348109517601043, "acc_norm_stderr": 0.012661233805616302 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.625, "acc_stderr": 0.029408372932278746, "acc_norm": 0.625, "acc_norm_stderr": 0.029408372932278746 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6339869281045751, "acc_stderr": 0.01948802574552967, "acc_norm": 0.6339869281045751, "acc_norm_stderr": 0.01948802574552967 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7810945273631841, "acc_stderr": 0.029239174636647, "acc_norm": 0.7810945273631841, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.25091799265605874, "mc1_stderr": 0.015176985027707696, "mc2": 0.5040944748516392, "mc2_stderr": 0.01650838155954231 }, "harness|winogrande|5": { "acc": 0.654301499605367, "acc_stderr": 0.013366596951934375 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
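The card explains that, besides the per-task configurations, a "results" configuration stores the aggregated run metrics used by the leaderboard. A hedged sketch of reading those aggregates follows: the configuration name comes from the card text, while the `latest` split name is an assumption carried over from the per-task configurations in this record's metadata.

```python
# Hedged sketch -- the "results" config name comes from the card text above;
# the "latest" split is assumed by analogy with the per-task configurations.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / acc_norm figures mirroring the JSON shown above
```

This complements the `harness_winogrande_5` example near the top of the card, which pulls per-example details for a single task rather than the aggregates.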
open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1
[ "region:us" ]
2024-01-28T13:48:18+00:00
{"pretty_name": "Evaluation run of freecs/ThetaWave-14B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freecs/ThetaWave-14B-v0.1](https://huggingface.co/freecs/ThetaWave-14B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T13:45:58.918363](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1/blob/main/results_2024-01-28T13-45-58.918363.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5965032661159082,\n \"acc_stderr\": 0.032717164407508395,\n \"acc_norm\": 0.6089127016997782,\n \"acc_norm_stderr\": 0.033610152499338415,\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707696,\n \"mc2\": 0.5040944748516392,\n \"mc2_stderr\": 0.01650838155954231\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3703071672354949,\n \"acc_stderr\": 0.014111298751674948,\n \"acc_norm\": 0.4283276450511945,\n \"acc_norm_stderr\": 0.014460496367599013\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3354909380601474,\n \"acc_stderr\": 0.004711968379069014,\n \"acc_norm\": 0.47092212706632147,\n \"acc_norm_stderr\": 0.004981336318033636\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851112,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851112\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400407002,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400407002\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 
0.7777777777777778,\n \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895817,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895817\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31508379888268156,\n \"acc_stderr\": 0.015536850852473642,\n \"acc_norm\": 0.31508379888268156,\n \"acc_norm_stderr\": 0.015536850852473642\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.02659678228769704,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.02659678228769704\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n \"acc_stderr\": 0.012661233805616302,\n \"acc_norm\": 0.4348109517601043,\n \"acc_norm_stderr\": 0.012661233805616302\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.01948802574552967,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.01948802574552967\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707696,\n \"mc2\": 0.5040944748516392,\n \"mc2_stderr\": 0.01650838155954231\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934375\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/freecs/ThetaWave-14B-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-45-58.918363.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-45-58.918363.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-45-58.918363.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-45-58.918363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-45-58.918363.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-45-58.918363.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["**/details_harness|winogrande|5_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T13-45-58.918363.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T13_45_58.918363", "path": ["results_2024-01-28T13-45-58.918363.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T13-45-58.918363.parquet"]}]}]}
2024-01-28T13:48:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of freecs/ThetaWave-14B-v0.1 Dataset automatically created during the evaluation run of model freecs/ThetaWave-14B-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T13:45:58.918363 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
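The processed card text above ends at "do the following:" because the original code snippet was stripped during processing; a minimal sketch of that load, assuming the repository follows the leaderboard's details_<org>__<model> naming and using the harness_winogrande_5 configuration and "latest" split that appear in this record's metadata:

```python
from datasets import load_dataset

# Assumed repo id, inferred from the card title "Evaluation run of freecs/ThetaWave-14B-v0.1"
# and the details_<org>__<model> naming convention used by the Open LLM Leaderboard.
data = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1",
    "harness_winogrande_5",  # any config_name listed in the metadata block works
    split="latest",          # or the timestamped split "2024_01_28T13_45_58.918363"
)
print(data)
```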
[ "# Dataset Card for Evaluation run of freecs/ThetaWave-14B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-14B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:45:58.918363(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of freecs/ThetaWave-14B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-14B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:45:58.918363(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c8fe019d8797a68710e25e83ca21275d6bb0f5d8
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_001 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_001](https://huggingface.co/Lvxy1117/amber_fine_tune_001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T13:46:59.201897](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001/blob/main/results_2024-01-28T13-46-59.201897.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.31138101151746983, "acc_stderr": 0.0325934760300863, "acc_norm": 0.3139371329907083, "acc_norm_stderr": 0.033375512329911525, "mc1": 0.2876376988984088, "mc1_stderr": 0.015846315101394805, "mc2": 0.429338384075007, "mc2_stderr": 0.015517791037983605 }, "harness|arc:challenge|25": { "acc": 0.4121160409556314, "acc_stderr": 0.014383915302225398, "acc_norm": 0.44795221843003413, "acc_norm_stderr": 0.01453201149821167 }, "harness|hellaswag|10": { "acc": 0.5683130850428202, "acc_stderr": 0.0049429906231311166, "acc_norm": 0.7378012348137821, "acc_norm_stderr": 0.004389312748012154 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.2962962962962963, "acc_stderr": 0.03944624162501117, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.03944624162501117 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.2894736842105263, "acc_stderr": 0.03690677986137282, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.03690677986137282 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.3660377358490566, "acc_stderr": 0.02964781353936525, "acc_norm": 0.3660377358490566, "acc_norm_stderr": 0.02964781353936525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.3055555555555556, "acc_stderr": 0.03852084696008534, "acc_norm": 0.3055555555555556, "acc_norm_stderr": 0.03852084696008534 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 
}, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.34104046242774566, "acc_stderr": 0.036146654241808254, "acc_norm": 0.34104046242774566, "acc_norm_stderr": 0.036146654241808254 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.17647058823529413, "acc_stderr": 0.03793281185307812, "acc_norm": 0.17647058823529413, "acc_norm_stderr": 0.03793281185307812 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.3191489361702128, "acc_stderr": 0.030472973363380045, "acc_norm": 0.3191489361702128, "acc_norm_stderr": 0.030472973363380045 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.31724137931034485, "acc_stderr": 0.03878352372138622, "acc_norm": 0.31724137931034485, "acc_norm_stderr": 0.03878352372138622 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2566137566137566, "acc_stderr": 0.022494510767503154, "acc_norm": 0.2566137566137566, "acc_norm_stderr": 0.022494510767503154 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.29365079365079366, "acc_stderr": 0.040735243221471276, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.040735243221471276 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3064516129032258, "acc_stderr": 0.026226485652553887, "acc_norm": 0.3064516129032258, "acc_norm_stderr": 0.026226485652553887 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.21182266009852216, "acc_stderr": 0.028748983689941065, "acc_norm": 0.21182266009852216, "acc_norm_stderr": 0.028748983689941065 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.296969696969697, "acc_stderr": 0.03567969772268049, "acc_norm": 0.296969696969697, "acc_norm_stderr": 0.03567969772268049 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.3383838383838384, "acc_stderr": 0.03371124142626302, "acc_norm": 0.3383838383838384, "acc_norm_stderr": 0.03371124142626302 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.32124352331606215, "acc_stderr": 0.033699508685490674, "acc_norm": 0.32124352331606215, "acc_norm_stderr": 0.033699508685490674 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2948717948717949, "acc_stderr": 0.023119362758232287, "acc_norm": 0.2948717948717949, "acc_norm_stderr": 0.023119362758232287 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.31932773109243695, "acc_stderr": 0.0302839955258844, "acc_norm": 0.31932773109243695, "acc_norm_stderr": 0.0302839955258844 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.19205298013245034, 
"acc_stderr": 0.03216298420593612, "acc_norm": 0.19205298013245034, "acc_norm_stderr": 0.03216298420593612 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.30275229357798167, "acc_stderr": 0.019698711434756353, "acc_norm": 0.30275229357798167, "acc_norm_stderr": 0.019698711434756353 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.38425925925925924, "acc_stderr": 0.03317354514310742, "acc_norm": 0.38425925925925924, "acc_norm_stderr": 0.03317354514310742 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.28431372549019607, "acc_stderr": 0.03166009679399812, "acc_norm": 0.28431372549019607, "acc_norm_stderr": 0.03166009679399812 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.3206751054852321, "acc_stderr": 0.030381931949990417, "acc_norm": 0.3206751054852321, "acc_norm_stderr": 0.030381931949990417 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.3901345291479821, "acc_stderr": 0.03273766725459156, "acc_norm": 0.3901345291479821, "acc_norm_stderr": 0.03273766725459156 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.3816793893129771, "acc_stderr": 0.0426073515764456, "acc_norm": 0.3816793893129771, "acc_norm_stderr": 0.0426073515764456 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2644628099173554, "acc_stderr": 0.040261875275912025, "acc_norm": 0.2644628099173554, "acc_norm_stderr": 0.040261875275912025 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.2777777777777778, "acc_stderr": 0.043300437496507437, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.043300437496507437 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2883435582822086, "acc_stderr": 0.03559039531617342, "acc_norm": 0.2883435582822086, "acc_norm_stderr": 0.03559039531617342 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.30357142857142855, "acc_stderr": 0.04364226155841044, "acc_norm": 0.30357142857142855, "acc_norm_stderr": 0.04364226155841044 }, "harness|hendrycksTest-management|5": { "acc": 0.2815533980582524, "acc_stderr": 0.04453254836326467, "acc_norm": 0.2815533980582524, "acc_norm_stderr": 0.04453254836326467 }, "harness|hendrycksTest-marketing|5": { "acc": 0.38461538461538464, "acc_stderr": 0.03187195347942466, "acc_norm": 0.38461538461538464, "acc_norm_stderr": 0.03187195347942466 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.36015325670498083, "acc_stderr": 0.01716636247136928, "acc_norm": 0.36015325670498083, "acc_norm_stderr": 0.01716636247136928 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.31213872832369943, "acc_stderr": 0.024946792225272314, "acc_norm": 0.31213872832369943, "acc_norm_stderr": 0.024946792225272314 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2346368715083799, "acc_stderr": 0.014173044098303653, "acc_norm": 0.2346368715083799, "acc_norm_stderr": 0.014173044098303653 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.3366013071895425, "acc_stderr": 0.027057974624494382, "acc_norm": 0.3366013071895425, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.35691318327974275, "acc_stderr": 0.027210420375934033, "acc_norm": 0.35691318327974275, "acc_norm_stderr": 0.027210420375934033 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.30864197530864196, "acc_stderr": 0.025702640260603753, "acc_norm": 0.30864197530864196, "acc_norm_stderr": 0.025702640260603753 
}, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2695035460992908, "acc_stderr": 0.026469036818590627, "acc_norm": 0.2695035460992908, "acc_norm_stderr": 0.026469036818590627 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.28748370273794005, "acc_stderr": 0.0115593373557085, "acc_norm": 0.28748370273794005, "acc_norm_stderr": 0.0115593373557085 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3125, "acc_stderr": 0.02815637344037142, "acc_norm": 0.3125, "acc_norm_stderr": 0.02815637344037142 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.3104575163398693, "acc_stderr": 0.018718067052623227, "acc_norm": 0.3104575163398693, "acc_norm_stderr": 0.018718067052623227 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.3090909090909091, "acc_stderr": 0.044262946482000985, "acc_norm": 0.3090909090909091, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2857142857142857, "acc_stderr": 0.028920583220675592, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.028920583220675592 }, "harness|hendrycksTest-sociology|5": { "acc": 0.3333333333333333, "acc_stderr": 0.03333333333333334, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.03333333333333334 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-virology|5": { "acc": 0.3674698795180723, "acc_stderr": 0.03753267402120574, "acc_norm": 0.3674698795180723, "acc_norm_stderr": 0.03753267402120574 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.38011695906432746, "acc_stderr": 0.037229657413855394, "acc_norm": 0.38011695906432746, "acc_norm_stderr": 0.037229657413855394 }, "harness|truthfulqa:mc|0": { "mc1": 0.2876376988984088, "mc1_stderr": 0.015846315101394805, "mc2": 0.429338384075007, "mc2_stderr": 0.015517791037983605 }, "harness|winogrande|5": { "acc": 0.6408839779005525, "acc_stderr": 0.013483115202120234 }, "harness|gsm8k|5": { "acc": 0.03639120545868082, "acc_stderr": 0.005158113489231189 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
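The card above notes that the aggregated metrics live in the "results" configuration and that the "latest" split always points at the newest run; a minimal sketch of pulling those aggregates for this dataset, using the repo id given in the card (the column layout of the results parquet is not specified here, so inspect it before relying on specific fields):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; "latest" tracks the newest evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001",
    "results",
    split="latest",
)
print(results)     # column layout depends on the harness version, so inspect it first
print(results[0])  # typically a single row carrying the aggregated scores
```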
open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001
[ "region:us" ]
2024-01-28T13:48:47+00:00
{"pretty_name": "Evaluation run of Lvxy1117/amber_fine_tune_001", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_001](https://huggingface.co/Lvxy1117/amber_fine_tune_001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T13:46:59.201897](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001/blob/main/results_2024-01-28T13-46-59.201897.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.31138101151746983,\n \"acc_stderr\": 0.0325934760300863,\n \"acc_norm\": 0.3139371329907083,\n \"acc_norm_stderr\": 0.033375512329911525,\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.015846315101394805,\n \"mc2\": 0.429338384075007,\n \"mc2_stderr\": 0.015517791037983605\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4121160409556314,\n \"acc_stderr\": 0.014383915302225398,\n \"acc_norm\": 0.44795221843003413,\n \"acc_norm_stderr\": 0.01453201149821167\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5683130850428202,\n \"acc_stderr\": 0.0049429906231311166,\n \"acc_norm\": 0.7378012348137821,\n \"acc_norm_stderr\": 0.004389312748012154\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501117,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501117\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.02964781353936525,\n \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.02964781353936525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n 
\"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307812,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307812\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138622,\n \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138622\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3064516129032258,\n \"acc_stderr\": 0.026226485652553887,\n \"acc_norm\": 0.3064516129032258,\n \"acc_norm_stderr\": 0.026226485652553887\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941065,\n \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941065\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232287,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232287\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593612,\n \"acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593612\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.30275229357798167,\n \"acc_stderr\": 0.019698711434756353,\n \"acc_norm\": 0.30275229357798167,\n \"acc_norm_stderr\": 0.019698711434756353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990417,\n \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990417\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.3901345291479821,\n \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.0426073515764456,\n \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.0426073515764456\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2644628099173554,\n \"acc_stderr\": 0.040261875275912025,\n \"acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.040261875275912025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326467,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326467\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.03187195347942466,\n \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.03187195347942466\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.36015325670498083,\n \"acc_stderr\": 0.01716636247136928,\n \"acc_norm\": 0.36015325670498083,\n \"acc_norm_stderr\": 0.01716636247136928\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n \"acc_stderr\": 0.014173044098303653,\n \"acc_norm\": 0.2346368715083799,\n \"acc_norm_stderr\": 0.014173044098303653\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3366013071895425,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.35691318327974275,\n \"acc_stderr\": 0.027210420375934033,\n \"acc_norm\": 0.35691318327974275,\n \"acc_norm_stderr\": 0.027210420375934033\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.30864197530864196,\n \"acc_stderr\": 0.025702640260603753,\n \"acc_norm\": 0.30864197530864196,\n \"acc_norm_stderr\": 0.025702640260603753\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n \"acc_stderr\": 0.0115593373557085,\n \"acc_norm\": 0.28748370273794005,\n \"acc_norm_stderr\": 0.0115593373557085\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3104575163398693,\n \"acc_stderr\": 0.018718067052623227,\n \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.018718067052623227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675592,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675592\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.38011695906432746,\n \"acc_stderr\": 0.037229657413855394,\n \"acc_norm\": 0.38011695906432746,\n \"acc_norm_stderr\": 0.037229657413855394\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.015846315101394805,\n \"mc2\": 0.429338384075007,\n \"mc2_stderr\": 0.015517791037983605\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120234\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03639120545868082,\n \"acc_stderr\": 0.005158113489231189\n }\n}\n```", 
"repo_url": "https://huggingface.co/Lvxy1117/amber_fine_tune_001", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T13_46_59.201897", "path": ["**/details_harness|winogrande|5_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T13-46-59.201897.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T13_46_59.201897", "path": ["results_2024-01-28T13-46-59.201897.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T13-46-59.201897.parquet"]}]}]}
2024-01-28T13:49:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_001 Dataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_001 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T13:46:59.201897 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
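The card above says "To load the details from a run, you can for instance do the following:" but the code snippet itself did not survive extraction. Below is a minimal sketch of that call; the repository id `open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001` is an assumption inferred from the leaderboard's `details_<org>__<model>` naming pattern, while the configuration and split names come from the metadata listed above.

```python
from datasets import load_dataset

# Repo id assumed from the "details_<org>__<model>" naming pattern; adjust if it differs.
# "harness_winogrande_5" and the "latest" split appear in the configuration list above.
data = load_dataset(
    "open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```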
[ "# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_001\n\n\n\nDataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:46:59.201897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_001\n\n\n\nDataset automatically created during the evaluation run of model Lvxy1117/amber_fine_tune_001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:46:59.201897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
290a4427b6a9bc392da7741814b4eff7acce02b2
# Dataset Card for "libri2Mix_test_unit" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/libri2Mix_test_unit
[ "region:us" ]
2024-01-28T13:48:48+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k_12bps", "path": "data/encodec_24k_12bps-*"}, {"split": "encodec_24k_1_5bps", "path": "data/encodec_24k_1_5bps-*"}, {"split": "encodec_24k_24bps", "path": "data/encodec_24k_24bps-*"}, {"split": "encodec_24k_3bps", "path": "data/encodec_24k_3bps-*"}, {"split": "encodec_24k_6bps", "path": "data/encodec_24k_6bps-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 16215839, "num_examples": 2000}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 16215839, "num_examples": 2000}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 24269183, "num_examples": 2000}, {"name": "audiodec_24k_320d", "num_bytes": 51773695, "num_examples": 2000}, {"name": "dac_16k", "num_bytes": 60908095, "num_examples": 2000}, {"name": "dac_24k", "num_bytes": 243839551, "num_examples": 2000}, {"name": "dac_44k", "num_bytes": 79082623, "num_examples": 2000}, {"name": "encodec_24k_12bps", "num_bytes": 97014847, "num_examples": 2000}, {"name": "encodec_24k_1_5bps", "num_bytes": 12209119, "num_examples": 2000}, {"name": "encodec_24k_24bps", "num_bytes": 193935679, "num_examples": 2000}, {"name": "encodec_24k_3bps", "num_bytes": 24324223, "num_examples": 2000}, {"name": "encodec_24k_6bps", "num_bytes": 48554431, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 129580607, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 129580607, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 129447999, "num_examples": 2000}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 65020991, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 129447999, "num_examples": 2000}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 65020991, "num_examples": 2000}, {"name": "speech_tokenizer_16k", "num_bytes": 32432511, "num_examples": 2000}], "download_size": 234832275, "dataset_size": 1548874829}}
2024-01-28T13:49:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "libri2Mix_test_unit" More Information needed
[ "# Dataset Card for \"libri2Mix_test_unit\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"libri2Mix_test_unit\"\n\nMore Information needed" ]
3e078ada4a278faac3ffb5b227c9fa9d424932b4
# Dataset Card for Evaluation run of yunconglong/13B_MATH_DPO <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [yunconglong/13B_MATH_DPO](https://huggingface.co/yunconglong/13B_MATH_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_yunconglong__13B_MATH_DPO", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T13:57:51.859773](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__13B_MATH_DPO/blob/main/results_2024-01-28T13-57-51.859773.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6520951734159913, "acc_stderr": 0.0320896342101235, "acc_norm": 0.6512213582132556, "acc_norm_stderr": 0.032773142500047994, "mc1": 0.6352509179926561, "mc1_stderr": 0.016850961061720137, "mc2": 0.786270657367768, "mc2_stderr": 0.013770581751355523 }, "harness|arc:challenge|25": { "acc": 0.7201365187713311, "acc_stderr": 0.013119040897725922, "acc_norm": 0.7465870307167235, "acc_norm_stderr": 0.012710896778378606 }, "harness|hellaswag|10": { "acc": 0.7255526787492531, "acc_stderr": 0.0044532337261103455, "acc_norm": 0.8951404102768373, "acc_norm_stderr": 0.0030574627544411952 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.03823428969926605, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.03823428969926605 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544064, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544064 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.02544636563440678, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.02544636563440678 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.044518079590553275, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188723, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188723 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8860103626943006, "acc_stderr": 0.022935144053919436, "acc_norm": 0.8860103626943006, "acc_norm_stderr": 0.022935144053919436 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 
0.03822746937658752, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658752 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078962, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078962 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608306, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608306 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4446927374301676, "acc_stderr": 0.01661988198817702, "acc_norm": 0.4446927374301676, "acc_norm_stderr": 0.01661988198817702 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.02536060379624256, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.02536060379624256 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.02592237178881877, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.02592237178881877 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083135, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083135 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069446, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069446 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.027539122889061456, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.027539122889061456 }, "harness|truthfulqa:mc|0": { "mc1": 0.6352509179926561, "mc1_stderr": 0.016850961061720137, "mc2": 0.786270657367768, "mc2_stderr": 0.013770581751355523 }, "harness|winogrande|5": { "acc": 0.8808208366219415, "acc_stderr": 0.009105988620006186 }, "harness|gsm8k|5": { "acc": 0.6709628506444276, "acc_stderr": 0.012942375603679383 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_yunconglong__13B_MATH_DPO
[ "region:us" ]
2024-01-28T14:00:14+00:00
{"pretty_name": "Evaluation run of yunconglong/13B_MATH_DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [yunconglong/13B_MATH_DPO](https://huggingface.co/yunconglong/13B_MATH_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__13B_MATH_DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T13:57:51.859773](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__13B_MATH_DPO/blob/main/results_2024-01-28T13-57-51.859773.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520951734159913,\n \"acc_stderr\": 0.0320896342101235,\n \"acc_norm\": 0.6512213582132556,\n \"acc_norm_stderr\": 0.032773142500047994,\n \"mc1\": 0.6352509179926561,\n \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.786270657367768,\n \"mc2_stderr\": 0.013770581751355523\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7201365187713311,\n \"acc_stderr\": 0.013119040897725922,\n \"acc_norm\": 0.7465870307167235,\n \"acc_norm_stderr\": 0.012710896778378606\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7255526787492531,\n \"acc_stderr\": 0.0044532337261103455,\n \"acc_norm\": 0.8951404102768373,\n \"acc_norm_stderr\": 0.0030574627544411952\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n 
\"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.02536060379624256,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.02536060379624256\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083135,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083135\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.786270657367768,\n \"mc2_stderr\": 0.013770581751355523\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8808208366219415,\n \"acc_stderr\": 0.009105988620006186\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6709628506444276,\n \"acc_stderr\": 0.012942375603679383\n }\n}\n```", "repo_url": "https://huggingface.co/yunconglong/13B_MATH_DPO", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-57-51.859773.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["**/details_harness|winogrande|5_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T13-57-51.859773.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T13_57_51.859773", "path": ["results_2024-01-28T13-57-51.859773.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T13-57-51.859773.parquet"]}]}]}
2024-01-28T14:00:42+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of yunconglong/13B_MATH_DPO Dataset automatically created during the evaluation run of model yunconglong/13B_MATH_DPO on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T13:57:51.859773 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of yunconglong/13B_MATH_DPO\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/13B_MATH_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:57:51.859773(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of yunconglong/13B_MATH_DPO\n\n\n\nDataset automatically created during the evaluation run of model yunconglong/13B_MATH_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T13:57:51.859773(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
ae11ee5ae882473f9049362b744a0b6b5ce75e3d
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
francoj/test
[ "language:zh", "license:cc-by-4.0", "not-for-all-audiences", "region:us" ]
2024-01-28T14:13:37+00:00
{"language": ["zh"], "license": "cc-by-4.0", "tags": ["not-for-all-audiences"]}
2024-01-28T14:17:01+00:00
[]
[ "zh" ]
TAGS #language-Chinese #license-cc-by-4.0 #not-for-all-audiences #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#language-Chinese #license-cc-by-4.0 #not-for-all-audiences #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6e2f7c9cfd16c2ae26fc733df40225214a8763d4
# Dataset Card for "ArabNews" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
MosenA/ArabNews
[ "region:us" ]
2024-01-28T14:19:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "date", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4805526759, "num_examples": 1718929}], "download_size": 2241962021, "dataset_size": 4805526759}}
2024-01-28T14:22:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for "ArabNews" More Information needed
[ "# Dataset Card for \"ArabNews\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"ArabNews\"\n\nMore Information needed" ]
ab302055132a5d3854fc020d1cb42c66d04a3e5a
# Dataset Card for Evaluation run of sethuiyer/OpenDolphinHermes_Llama2_7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [sethuiyer/OpenDolphinHermes_Llama2_7B](https://huggingface.co/sethuiyer/OpenDolphinHermes_Llama2_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_sethuiyer__OpenDolphinHermes_Llama2_7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T14:20:43.046605](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__OpenDolphinHermes_Llama2_7B/blob/main/results_2024-01-28T14-20-43.046605.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5216936113029734, "acc_stderr": 0.0344343602407135, "acc_norm": 0.5274587758530744, "acc_norm_stderr": 0.03518833838631539, "mc1": 0.3108935128518972, "mc1_stderr": 0.016203316673559693, "mc2": 0.46099160063090105, "mc2_stderr": 0.015333190393080273 }, "harness|arc:challenge|25": { "acc": 0.507679180887372, "acc_stderr": 0.014609667440892574, "acc_norm": 0.5503412969283277, "acc_norm_stderr": 0.014537144444284732 }, "harness|hellaswag|10": { "acc": 0.5999800836486756, "acc_stderr": 0.004889007921214696, "acc_norm": 0.7873929496116312, "acc_norm_stderr": 0.004083157276012489 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.4868421052631579, "acc_stderr": 0.04067533136309174, "acc_norm": 0.4868421052631579, "acc_norm_stderr": 0.04067533136309174 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5471698113207547, "acc_stderr": 0.03063562795796182, "acc_norm": 0.5471698113207547, "acc_norm_stderr": 0.03063562795796182 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5416666666666666, "acc_stderr": 0.04166666666666665, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 
0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4624277456647399, "acc_stderr": 0.0380168510452446, "acc_norm": 0.4624277456647399, "acc_norm_stderr": 0.0380168510452446 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.46382978723404256, "acc_stderr": 0.03260038511835771, "acc_norm": 0.46382978723404256, "acc_norm_stderr": 0.03260038511835771 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.04462917535336936, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.04462917535336936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5103448275862069, "acc_stderr": 0.04165774775728762, "acc_norm": 0.5103448275862069, "acc_norm_stderr": 0.04165774775728762 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.023919984164047732, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.023919984164047732 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04285714285714281, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04285714285714281 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6, "acc_stderr": 0.02786932057166463, "acc_norm": 0.6, "acc_norm_stderr": 0.02786932057166463 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.03413963805906235, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.03413963805906235 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0368105086916155, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0368105086916155 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6313131313131313, "acc_stderr": 0.034373055019806184, "acc_norm": 0.6313131313131313, "acc_norm_stderr": 0.034373055019806184 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7668393782383419, "acc_stderr": 0.03051611137147601, "acc_norm": 0.7668393782383419, "acc_norm_stderr": 0.03051611137147601 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5230769230769231, "acc_stderr": 0.025323990861736242, "acc_norm": 0.5230769230769231, "acc_norm_stderr": 0.025323990861736242 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.027420019350945273, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.027420019350945273 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.03243718055137411, "acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.03243718055137411 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 
0.037345356767871984, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.037345356767871984 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.708256880733945, "acc_stderr": 0.019489300968876522, "acc_norm": 0.708256880733945, "acc_norm_stderr": 0.019489300968876522 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.03350991604696043, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.03350991604696043 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7009803921568627, "acc_stderr": 0.03213325717373617, "acc_norm": 0.7009803921568627, "acc_norm_stderr": 0.03213325717373617 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5874439461883408, "acc_stderr": 0.03304062175449297, "acc_norm": 0.5874439461883408, "acc_norm_stderr": 0.03304062175449297 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6106870229007634, "acc_stderr": 0.04276486542814591, "acc_norm": 0.6106870229007634, "acc_norm_stderr": 0.04276486542814591 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6859504132231405, "acc_stderr": 0.042369647530410184, "acc_norm": 0.6859504132231405, "acc_norm_stderr": 0.042369647530410184 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6388888888888888, "acc_stderr": 0.046434546089062764, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.046434546089062764 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5398773006134969, "acc_stderr": 0.03915857291436971, "acc_norm": 0.5398773006134969, "acc_norm_stderr": 0.03915857291436971 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6504854368932039, "acc_stderr": 0.047211885060971716, "acc_norm": 0.6504854368932039, "acc_norm_stderr": 0.047211885060971716 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7264957264957265, "acc_stderr": 0.029202540153431173, "acc_norm": 0.7264957264957265, "acc_norm_stderr": 0.029202540153431173 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6730523627075351, "acc_stderr": 0.016774908180131463, "acc_norm": 0.6730523627075351, "acc_norm_stderr": 0.016774908180131463 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5838150289017341, "acc_stderr": 0.026538189104705484, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.026538189104705484 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24804469273743016, "acc_stderr": 0.014444157808261431, "acc_norm": 0.24804469273743016, "acc_norm_stderr": 0.014444157808261431 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5588235294117647, "acc_stderr": 0.028431095444176643, "acc_norm": 0.5588235294117647, "acc_norm_stderr": 0.028431095444176643 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6334405144694534, "acc_stderr": 0.027368078243971628, "acc_norm": 0.6334405144694534, "acc_norm_stderr": 0.027368078243971628 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5648148148148148, "acc_stderr": 0.027586006221607704, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.027586006221607704 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.36524822695035464, "acc_stderr": 0.028723863853281278, "acc_norm": 0.36524822695035464, "acc_norm_stderr": 0.028723863853281278 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.40808344198174706, "acc_stderr": 0.012552598958563666, "acc_norm": 0.40808344198174706, "acc_norm_stderr": 0.012552598958563666 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4522058823529412, "acc_stderr": 0.030233758551596445, "acc_norm": 0.4522058823529412, "acc_norm_stderr": 0.030233758551596445 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.45751633986928103, "acc_stderr": 0.020154685712590884, "acc_norm": 0.45751633986928103, "acc_norm_stderr": 0.020154685712590884 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5272727272727272, "acc_stderr": 0.04782001791380061, "acc_norm": 0.5272727272727272, "acc_norm_stderr": 0.04782001791380061 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6204081632653061, "acc_stderr": 0.031067211262872468, "acc_norm": 0.6204081632653061, "acc_norm_stderr": 0.031067211262872468 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7313432835820896, "acc_stderr": 0.03134328358208954, "acc_norm": 0.7313432835820896, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699122, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6549707602339181, "acc_stderr": 0.03645981377388806, "acc_norm": 0.6549707602339181, "acc_norm_stderr": 0.03645981377388806 }, "harness|truthfulqa:mc|0": { "mc1": 0.3108935128518972, "mc1_stderr": 0.016203316673559693, "mc2": 0.46099160063090105, "mc2_stderr": 0.015333190393080273 }, "harness|winogrande|5": { "acc": 0.7316495659037096, "acc_stderr": 0.012453340359561195 }, "harness|gsm8k|5": { "acc": 0.20166793025018953, "acc_stderr": 0.011052295889544374 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_sethuiyer__OpenDolphinHermes_Llama2_7B
[ "region:us" ]
2024-01-28T14:23:05+00:00
{"pretty_name": "Evaluation run of sethuiyer/OpenDolphinHermes_Llama2_7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [sethuiyer/OpenDolphinHermes_Llama2_7B](https://huggingface.co/sethuiyer/OpenDolphinHermes_Llama2_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sethuiyer__OpenDolphinHermes_Llama2_7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T14:20:43.046605](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__OpenDolphinHermes_Llama2_7B/blob/main/results_2024-01-28T14-20-43.046605.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5216936113029734,\n \"acc_stderr\": 0.0344343602407135,\n \"acc_norm\": 0.5274587758530744,\n \"acc_norm_stderr\": 0.03518833838631539,\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.46099160063090105,\n \"mc2_stderr\": 0.015333190393080273\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.014609667440892574,\n \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284732\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5999800836486756,\n \"acc_stderr\": 0.004889007921214696,\n \"acc_norm\": 0.7873929496116312,\n \"acc_norm_stderr\": 0.004083157276012489\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309174,\n \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309174\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.03063562795796182,\n \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.03063562795796182\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.02786932057166463,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.02786932057166463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6313131313131313,\n \"acc_stderr\": 0.034373055019806184,\n \"acc_norm\": 0.6313131313131313,\n \"acc_norm_stderr\": 0.034373055019806184\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736242,\n \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736242\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.708256880733945,\n \"acc_stderr\": 0.019489300968876522,\n \"acc_norm\": 0.708256880733945,\n \"acc_norm_stderr\": 0.019489300968876522\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.046434546089062764,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.046434546089062764\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n \"acc_stderr\": 0.029202540153431173,\n \"acc_norm\": 0.7264957264957265,\n \"acc_norm_stderr\": 0.029202540153431173\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6730523627075351,\n \"acc_stderr\": 0.016774908180131463,\n \"acc_norm\": 0.6730523627075351,\n \"acc_norm_stderr\": 0.016774908180131463\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.026538189104705484,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.026538189104705484\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261431,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261431\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971628,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971628\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607704,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n \"acc_stderr\": 0.012552598958563666,\n \"acc_norm\": 0.40808344198174706,\n \"acc_norm_stderr\": 0.012552598958563666\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596445,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.020154685712590884,\n \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.020154685712590884\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872468,\n \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872468\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.03645981377388806,\n \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.03645981377388806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3108935128518972,\n \"mc1_stderr\": 0.016203316673559693,\n \"mc2\": 0.46099160063090105,\n \"mc2_stderr\": 0.015333190393080273\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20166793025018953,\n \"acc_stderr\": 
0.011052295889544374\n }\n}\n```", "repo_url": "https://huggingface.co/sethuiyer/OpenDolphinHermes_Llama2_7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|arc:challenge|25_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|gsm8k|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hellaswag|10_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T14-20-43.046605.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T14-20-43.046605.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T14-20-43.046605.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T14-20-43.046605.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T14-20-43.046605.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T14_20_43.046605", "path": ["**/details_harness|winogrande|5_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T14-20-43.046605.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T14_20_43.046605", "path": ["results_2024-01-28T14-20-43.046605.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T14-20-43.046605.parquet"]}]}]}
2024-01-28T14:23:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of sethuiyer/OpenDolphinHermes_Llama2_7B Dataset automatically created during the evaluation run of model sethuiyer/OpenDolphinHermes_Llama2_7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T14:20:43.046605 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of sethuiyer/OpenDolphinHermes_Llama2_7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/OpenDolphinHermes_Llama2_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T14:20:43.046605(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of sethuiyer/OpenDolphinHermes_Llama2_7B\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/OpenDolphinHermes_Llama2_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T14:20:43.046605(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
043d0c6b4346179dcd9d23cf6e6981fe91843600
# Dataset Card for "IP2P-edit-SSLWM-try-step50-7.5_1.5-200" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FelixdoingAI/IP2P-edit-SSLWM-try-step50-7.5_1.5-200
[ "region:us" ]
2024-01-28T14:29:53+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "original_prompt", "dtype": "string"}, {"name": "original_image", "dtype": "image"}, {"name": "edit_prompt", "dtype": "string"}, {"name": "edited_prompt", "dtype": "string"}, {"name": "edited_image", "dtype": "image"}, {"name": "adversarial_image", "dtype": "image"}, {"name": "edit_adv_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 90630546.0, "num_examples": 200}], "download_size": 0, "dataset_size": 90630546.0}}
2024-01-30T09:17:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "IP2P-edit-SSLWM-try-step50-7.5_1.5-200" More Information needed
[ "# Dataset Card for \"IP2P-edit-SSLWM-try-step50-7.5_1.5-200\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"IP2P-edit-SSLWM-try-step50-7.5_1.5-200\"\n\nMore Information needed" ]
636a726fe448ecd7f9d70244924207cbafcf896d
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
WhiteAiGPT/whiteaigpt
[ "task_categories:question-answering", "task_categories:text-classification", "task_categories:text-generation", "task_categories:conversational", "size_categories:1M<n<10M", "language:en", "language:sq", "language:de", "license:apache-2.0", "biology", "chemistry", "legal", "region:us" ]
2024-01-28T14:30:19+00:00
{"language": ["en", "sq", "de"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["question-answering", "text-classification", "text-generation", "conversational"], "pretty_name": "WhiteGPT", "tags": ["biology", "chemistry", "legal"]}
2024-01-28T14:33:04+00:00
[]
[ "en", "sq", "de" ]
TAGS #task_categories-question-answering #task_categories-text-classification #task_categories-text-generation #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Albanian #language-German #license-apache-2.0 #biology #chemistry #legal #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-classification #task_categories-text-generation #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Albanian #language-German #license-apache-2.0 #biology #chemistry #legal #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
233a1dbc6c321676ed0962e2482d8675cc0d77c9
All of Thierry Crouzet's diary entries as published on his blog since 2015. Text embeddings computed with OpenAI's text-embedding-3-large model.
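Embeddings like the ones shipped here are produced by calling the OpenAI embeddings endpoint. A minimal sketch with the standard OpenAI Python SDK; the input string and everything around the call are placeholders, not material from this dataset:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Embed one diary entry with the model named in the card.
response = client.embeddings.create(
    model="text-embedding-3-large",
    input="Placeholder diary entry text...",
)
vector = response.data[0].embedding
print(len(vector))  # 3072 dimensions by default for text-embedding-3-large
```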
tcrouzet/journal-large
[ "license:apache-2.0", "art", "writing", "doi:10.57967/hf/1705", "region:us" ]
2024-01-28T15:14:09+00:00
{"license": "apache-2.0", "tags": ["art", "writing"]}
2024-01-28T15:22:54+00:00
[]
[]
TAGS #license-apache-2.0 #art #writing #doi-10.57967/hf/1705 #region-us
All of Thierry Crouzet's diary entries as published on his blog since 2015. Text embeddings computed with OpenAI's text-embedding-3-large model.
[]
[ "TAGS\n#license-apache-2.0 #art #writing #doi-10.57967/hf/1705 #region-us \n" ]
fb8d1551e2ea415e0b0361b8ce044bfc865fe2bc
A Retrieval evaluation dataset for the [h-corpus](https://huggingface.co/datasets/a686d380/h-corpus-2023) domain. # Leaderboard ## new/data_sample1k | Model | NDCG@5 | NDCG@10 | NDCG@15 | NDCG@20 | NDCG@30 | |-------|---------|---------|---------|---------|---------| | [IYun-large-zh](https://huggingface.co/Erin/IYun-large-zh) | 66.70±27.29 | 59.67±26.05 | 56.69±25.36 | 56.58±25.32 | 57.97±25.48 | | [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1) | 60.66±28.37 | 53.44±26.13 | 51.11±25.10 | 51.18±25.16 | 52.84±25.45 | | [Dmeta-embedding](https://huggingface.co/DMetaSoul/Dmeta-embedding) | 52.12±29.83 | 45.38±26.65 | 43.20±25.33 | 43.41±25.10 | 44.87±25.42 | | random | 0.07±1.24 | 0.09±1.01 | 0.10±0.97 | 0.12±0.99 | 0.14±1.03 | ## data_sample5k | Model | NDCG@10 | |-------|---------| | [IYun-large-zh](https://huggingface.co/Erin/IYun-large-zh) | 38.75 | | [tao-8k](https://huggingface.co/amu/tao-8k) | 38.37 | | [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1) | 35.81 | | [acge-large-zh](https://huggingface.co/aspire/acge-large-zh) | 34.26 | | [gte-large-zh](https://huggingface.co/thenlper/gte-large-zh) | 33.07 | | [PEG](https://huggingface.co/TownsWu/PEG) | 24.82 | | [Dmeta-embedding](https://huggingface.co/DMetaSoul/Dmeta-embedding) | 23.45 |
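The leaderboard above reports NDCG@k. As a reminder of the metric being averaged, a small self-contained sketch of NDCG@k for a single query with binary relevance (this helper is illustrative and not code from the repository):

```python
import math

def ndcg_at_k(ranked_relevances, k):
    # DCG of the system ranking, truncated at k.
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ranked_relevances[:k]))
    # Ideal DCG: the same relevances sorted from best to worst.
    ideal = sorted(ranked_relevances, reverse=True)
    idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
    return dcg / idcg if idcg > 0 else 0.0

# Example: the single relevant passage was retrieved at rank 3 among 10 candidates.
print(ndcg_at_k([0, 0, 1, 0, 0, 0, 0, 0, 0, 0], k=10))  # 0.5
```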
Limour/H2Retrieval
[ "language:zh", "license:cc-by-nc-sa-4.0", "region:us" ]
2024-01-28T15:47:24+00:00
{"language": ["zh"], "license": "cc-by-nc-sa-4.0"}
2024-01-31T08:34:28+00:00
[]
[ "zh" ]
TAGS #language-Chinese #license-cc-by-nc-sa-4.0 #region-us
A Retrieval evaluation dataset for the h-corpus domain. Leaderboard =========== new/data\_sample1k ------------------ data\_sample5k --------------
[]
[ "TAGS\n#language-Chinese #license-cc-by-nc-sa-4.0 #region-us \n" ]
517c792155b3d5ea8009305070c988cfe8b5a978
A Retrieval evaluation dataset for the [visual novel](https://huggingface.co/datasets/Limour/b-corpus) domain. # Leaderboard ## data_sample2k | Model | NDCG@3 | NDCG@10 | NDCG@50 | NDCG@100 | NDCG@200 | |-------|---------|---------|---------|---------|---------| | [IYun-large-zh](https://huggingface.co/Erin/IYun-large-zh) | 80.53±20.53 | 71.40±20.87 | 52.93±21.96 | 43.40±20.72 | 34.88±18.50 | | [bce-embedding-base_v1](https://huggingface.co/maidalun1020/bce-embedding-base_v1) | 77.08±23.44 | 68.39±22.61 | 51.95±22.85 | 43.36±21.51 | 35.31±19.09 | | [Dmeta-embedding](https://huggingface.co/DMetaSoul/Dmeta-embedding) | 77.56±22.12 | 68.62±21.96 | 51.58±22.29 | 42.71±21.04 | 34.33±18.61 | | random | 01.66±05.78 | 02.06±03.66 | 02.23±02.52 | 02.13±02.03 | 02.40±01.91 |
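Scores like the ones above come from embedding queries and corpus passages with the model under test and ranking the passages by cosine similarity. A minimal sketch, assuming the evaluated model can be loaded through sentence-transformers (the texts are placeholders, not actual corpus content):

```python
from sentence_transformers import SentenceTransformer, util

# Model id taken from the leaderboard; loading it this way assumes the repo
# ships sentence-transformers support.
model = SentenceTransformer("maidalun1020/bce-embedding-base_v1")

query = "placeholder query"
passages = ["candidate passage one", "candidate passage two", "candidate passage three"]

q_emb = model.encode(query, normalize_embeddings=True)
p_emb = model.encode(passages, normalize_embeddings=True)

scores = util.cos_sim(q_emb, p_emb)[0]           # cosine similarity of the query vs. each passage
order = scores.argsort(descending=True).tolist()
print([passages[i] for i in order])              # ranking; feed the ranks into NDCG@k
```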
Limour/G2Retrieval
[ "language:zh", "license:cc-by-nc-sa-4.0", "region:us" ]
2024-01-28T15:53:21+00:00
{"language": ["zh"], "license": "cc-by-nc-sa-4.0"}
2024-01-31T06:23:49+00:00
[]
[ "zh" ]
TAGS #language-Chinese #license-cc-by-nc-sa-4.0 #region-us
A Retrieval evaluation dataset for the visual novel domain. Leaderboard =========== data\_sample2k --------------
[]
[ "TAGS\n#language-Chinese #license-cc-by-nc-sa-4.0 #region-us \n" ]
ac0a9952326dcaf28003d142a3ec1377f949dae8
# Dataset Card for "numericbench_eval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/numericbench_eval
[ "region:us" ]
2024-01-28T15:53:51+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 330123, "num_examples": 936}], "download_size": 118358, "dataset_size": 330123}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-28T16:03:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "numericbench_eval" More Information needed
[ "# Dataset Card for \"numericbench_eval\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"numericbench_eval\"\n\nMore Information needed" ]
3c608fec1f1fafeca84ddbf666185f139d979cca
# Dataset Card for Evaluation run of paulilioaica/Hugo-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [paulilioaica/Hugo-7B-slerp](https://huggingface.co/paulilioaica/Hugo-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T16:18:16.274715](https://huggingface.co/datasets/open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp/blob/main/results_2024-01-28T16-18-16.274715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6267074852876857, "acc_stderr": 0.032731254967334075, "acc_norm": 0.629499889748113, "acc_norm_stderr": 0.03339162430846229, "mc1": 0.40269277845777235, "mc1_stderr": 0.01716883093518722, "mc2": 0.5712762320713095, "mc2_stderr": 0.015518920710934565 }, "harness|arc:challenge|25": { "acc": 0.5972696245733788, "acc_stderr": 0.014332236306790147, "acc_norm": 0.6450511945392492, "acc_norm_stderr": 0.013983036904094085 }, "harness|hellaswag|10": { "acc": 0.6493726349332802, "acc_stderr": 0.004761912511707509, "acc_norm": 0.8477394941246763, "acc_norm_stderr": 0.0035853896364723757 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.035834961763610736, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.035834961763610736 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337124, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337124 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.038009680605548594, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.038009680605548594 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.0372424959581773, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.0372424959581773 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.047551296160629475, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5361702127659574, "acc_stderr": 0.032600385118357715, "acc_norm": 0.5361702127659574, "acc_norm_stderr": 0.032600385118357715 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.04082482904638628, "acc_norm": 0.6, "acc_norm_stderr": 0.04082482904638628 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43386243386243384, "acc_stderr": 0.02552503438247489, "acc_norm": 0.43386243386243384, "acc_norm_stderr": 0.02552503438247489 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.632258064516129, "acc_stderr": 0.027430866579973463, "acc_norm": 0.632258064516129, "acc_norm_stderr": 0.027430866579973463 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4827586206896552, "acc_stderr": 0.035158955511656986, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494562, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494562 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306422, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306422 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6230769230769231, "acc_stderr": 0.024570975364225995, "acc_norm": 0.6230769230769231, "acc_norm_stderr": 0.024570975364225995 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.028972648884844267, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.028972648884844267 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6554621848739496, "acc_stderr": 0.030868682604121622, "acc_norm": 0.6554621848739496, "acc_norm_stderr": 0.030868682604121622 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3841059602649007, "acc_stderr": 0.03971301814719197, "acc_norm": 
0.3841059602649007, "acc_norm_stderr": 0.03971301814719197 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8238532110091743, "acc_stderr": 0.016332882393431353, "acc_norm": 0.8238532110091743, "acc_norm_stderr": 0.016332882393431353 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640766, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640766 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7805907172995781, "acc_stderr": 0.026939106581553945, "acc_norm": 0.7805907172995781, "acc_norm_stderr": 0.026939106581553945 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6547085201793722, "acc_stderr": 0.03191100192835795, "acc_norm": 0.6547085201793722, "acc_norm_stderr": 0.03191100192835795 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.03457272836917669, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.03457272836917669 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5267857142857143, "acc_stderr": 0.047389751192741546, "acc_norm": 0.5267857142857143, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281365, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281365 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8122605363984674, "acc_stderr": 0.013964393769899134, "acc_norm": 0.8122605363984674, "acc_norm_stderr": 0.013964393769899134 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6994219653179191, "acc_stderr": 0.0246853168672578, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.0246853168672578 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4022346368715084, "acc_stderr": 0.016399716732847135, "acc_norm": 0.4022346368715084, "acc_norm_stderr": 0.016399716732847135 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729477, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729477 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885135, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885135 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6944444444444444, "acc_stderr": 0.02563082497562135, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.02563082497562135 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, 
"acc_stderr": 0.02977945095730307, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.02977945095730307 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4634941329856584, "acc_stderr": 0.012736153390214963, "acc_norm": 0.4634941329856584, "acc_norm_stderr": 0.012736153390214963 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6397058823529411, "acc_stderr": 0.029163128570670733, "acc_norm": 0.6397058823529411, "acc_norm_stderr": 0.029163128570670733 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6535947712418301, "acc_stderr": 0.01924978569171721, "acc_norm": 0.6535947712418301, "acc_norm_stderr": 0.01924978569171721 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6318407960199005, "acc_stderr": 0.034104105654953025, "acc_norm": 0.6318407960199005, "acc_norm_stderr": 0.034104105654953025 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.40269277845777235, "mc1_stderr": 0.01716883093518722, "mc2": 0.5712762320713095, "mc2_stderr": 0.015518920710934565 }, "harness|winogrande|5": { "acc": 0.8003157063930545, "acc_stderr": 0.01123532838262585 }, "harness|gsm8k|5": { "acc": 0.5344958301743745, "acc_stderr": 0.013739668147545916 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp
[ "region:us" ]
2024-01-28T16:07:23+00:00
{"pretty_name": "Evaluation run of paulilioaica/Hugo-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulilioaica/Hugo-7B-slerp](https://huggingface.co/paulilioaica/Hugo-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T16:18:16.274715](https://huggingface.co/datasets/open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp/blob/main/results_2024-01-28T16-18-16.274715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6267074852876857,\n \"acc_stderr\": 0.032731254967334075,\n \"acc_norm\": 0.629499889748113,\n \"acc_norm_stderr\": 0.03339162430846229,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5712762320713095,\n \"mc2_stderr\": 0.015518920710934565\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790147,\n \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094085\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6493726349332802,\n \"acc_stderr\": 0.004761912511707509,\n \"acc_norm\": 0.8477394941246763,\n \"acc_norm_stderr\": 0.0035853896364723757\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.035834961763610736,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.035834961763610736\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638628,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638628\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n \"acc_stderr\": 0.027430866579973463,\n \"acc_norm\": 0.632258064516129,\n \"acc_norm_stderr\": 0.027430866579973463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306422,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n 
\"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431353,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835795,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835795\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899134,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 
0.013964393769899134\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n \"acc_stderr\": 0.016399716732847135,\n \"acc_norm\": 0.4022346368715084,\n \"acc_norm_stderr\": 0.016399716732847135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.034104105654953025,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.034104105654953025\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5712762320713095,\n \"mc2_stderr\": 0.015518920710934565\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.01123532838262585\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5344958301743745,\n \"acc_stderr\": 0.013739668147545916\n }\n}\n```", "repo_url": "https://huggingface.co/paulilioaica/Hugo-7B-slerp", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|arc:challenge|25_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|arc:challenge|25_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|arc:challenge|25_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|gsm8k|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|gsm8k|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|gsm8k|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hellaswag|10_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hellaswag|10_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hellaswag|10_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-05-05.675065.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T16-05-05.675065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-10-37.422508.parquet", 
"**/details_harness|hendrycksTest-management|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T16-10-37.422508.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-18-16.274715.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-18-16.274715.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T16-18-16.274715.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T16-18-16.274715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", 
"path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["**/details_harness|winogrande|5_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["**/details_harness|winogrande|5_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["**/details_harness|winogrande|5_2024-01-28T16-18-16.274715.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T16-18-16.274715.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T16_05_05.675065", "path": ["results_2024-01-28T16-05-05.675065.parquet"]}, {"split": "2024_01_28T16_10_37.422508", "path": ["results_2024-01-28T16-10-37.422508.parquet"]}, {"split": "2024_01_28T16_18_16.274715", "path": ["results_2024-01-28T16-18-16.274715.parquet"]}, 
{"split": "latest", "path": ["results_2024-01-28T16-18-16.274715.parquet"]}]}]}
2024-01-28T16:20:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of paulilioaica/Hugo-7B-slerp Dataset automatically created during the evaluation run of model paulilioaica/Hugo-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T16:18:16.274715 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
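For example, to load one of the task configurations listed in the metadata above (a minimal sketch; the repository id `open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp` is assumed from the standard Open LLM Leaderboard naming, while the config name `harness_winogrande_5` and the `latest` split are taken from the metadata):

```python
from datasets import load_dataset

# Repository id assumed from the usual Open LLM Leaderboard naming convention;
# the config name and the "latest" split appear in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_paulilioaica__Hugo-7B-slerp",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```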
[ "# Dataset Card for Evaluation run of paulilioaica/Hugo-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model paulilioaica/Hugo-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T16:18:16.274715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of paulilioaica/Hugo-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model paulilioaica/Hugo-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T16:18:16.274715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5f992e88d88369f9704ec19b3bca192bc9fd903b
# Dataset Card for "numeric_synth_eval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/numeric_synth_eval
[ "region:us" ]
2024-01-28T16:12:43+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 254396, "num_examples": 1000}], "download_size": 18594, "dataset_size": 254396}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-28T16:21:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for "numeric_synth_eval" More Information needed
[ "# Dataset Card for \"numeric_synth_eval\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"numeric_synth_eval\"\n\nMore Information needed" ]
a4fc20eda939230963d43f478c29f7e6854e0dcd
# Dataset Card for "numeric_bench" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/numeric_bench
[ "region:us" ]
2024-01-28T16:12:55+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 579085.0, "num_examples": 1936}], "download_size": 142464, "dataset_size": 579085.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-28T16:21:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "numeric_bench" More Information needed
[ "# Dataset Card for \"numeric_bench\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"numeric_bench\"\n\nMore Information needed" ]
b637014c2b8dc46d1b656549a7a10380f7dd260a
# Dataset Card for "od_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
iarbel/od_dataset
[ "region:us" ]
2024-01-28T16:17:44+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "height", "dtype": "int64"}, {"name": "width", "dtype": "int64"}, {"name": "image_id", "dtype": "string"}, {"name": "objects", "struct": [{"name": "area", "sequence": "int64"}, {"name": "bbox", "sequence": {"sequence": "int64"}}, {"name": "category", "sequence": "int64"}]}], "splits": [{"name": "train", "num_bytes": 4345863.0, "num_examples": 80}, {"name": "test", "num_bytes": 1017795.0, "num_examples": 19}], "download_size": 5262915, "dataset_size": 5363658.0}}
2024-01-28T16:17:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for "od_dataset" More Information needed
[ "# Dataset Card for \"od_dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"od_dataset\"\n\nMore Information needed" ]
a7d0e708423b9cdf76c952d0d0d0b6f0a1ad2c36
# Dataset Card for Evaluation run of freecs/ThetaWave-28B-v0.1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [freecs/ThetaWave-28B-v0.1](https://huggingface.co/freecs/ThetaWave-28B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T16:24:41.257091](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1/blob/main/results_2024-01-28T16-24-41.257091.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5298897040848843, "acc_stderr": 0.03399795127047971, "acc_norm": 0.5387301326295254, "acc_norm_stderr": 0.03493833391824102, "mc1": 0.24969400244798043, "mc1_stderr": 0.015152286907148125, "mc2": 0.4986404189534564, "mc2_stderr": 0.01637095679991428 }, "harness|arc:challenge|25": { "acc": 0.3097269624573379, "acc_stderr": 0.013512058415238361, "acc_norm": 0.3660409556313993, "acc_norm_stderr": 0.014077223108470137 }, "harness|hellaswag|10": { "acc": 0.29087831109340767, "acc_stderr": 0.004532393111248689, "acc_norm": 0.35540728938458477, "acc_norm_stderr": 0.00477658353090957 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4962962962962963, "acc_stderr": 0.04319223625811331, "acc_norm": 0.4962962962962963, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6052631578947368, "acc_stderr": 0.039777499346220734, "acc_norm": 0.6052631578947368, "acc_norm_stderr": 0.039777499346220734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.54, "acc_stderr": 0.05009082659620332, "acc_norm": 0.54, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5849056603773585, "acc_stderr": 0.030325945789286105, "acc_norm": 0.5849056603773585, "acc_norm_stderr": 0.030325945789286105 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6736111111111112, "acc_stderr": 0.03921067198982266, "acc_norm": 0.6736111111111112, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5086705202312138, "acc_stderr": 0.03811890988940412, "acc_norm": 0.5086705202312138, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.04755129616062947, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.04755129616062947 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.61, "acc_stderr": 0.049020713000019756, "acc_norm": 0.61, "acc_norm_stderr": 0.049020713000019756 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.49361702127659574, "acc_stderr": 0.03268335899936336, "acc_norm": 0.49361702127659574, "acc_norm_stderr": 0.03268335899936336 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.37719298245614036, "acc_stderr": 0.04559522141958216, "acc_norm": 0.37719298245614036, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4827586206896552, "acc_stderr": 0.04164188720169377, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.04164188720169377 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.373015873015873, "acc_stderr": 0.02490699045899257, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.02490699045899257 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949098, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949098 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6516129032258065, "acc_stderr": 0.02710482632810094, "acc_norm": 0.6516129032258065, "acc_norm_stderr": 0.02710482632810094 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4482758620689655, "acc_stderr": 0.03499113137676744, "acc_norm": 0.4482758620689655, "acc_norm_stderr": 0.03499113137676744 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03477691162163659, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03477691162163659 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6767676767676768, "acc_stderr": 0.033322999210706444, "acc_norm": 0.6767676767676768, "acc_norm_stderr": 0.033322999210706444 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7512953367875648, "acc_stderr": 0.031195840877700293, "acc_norm": 0.7512953367875648, "acc_norm_stderr": 0.031195840877700293 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5102564102564102, "acc_stderr": 0.025345672221942374, "acc_norm": 0.5102564102564102, "acc_norm_stderr": 0.025345672221942374 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815635, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815635 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5714285714285714, "acc_stderr": 0.03214536859788639, "acc_norm": 0.5714285714285714, "acc_norm_stderr": 0.03214536859788639 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.271523178807947, "acc_stderr": 
0.036313298039696525, "acc_norm": 0.271523178807947, "acc_norm_stderr": 0.036313298039696525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7192660550458716, "acc_stderr": 0.019266055045871623, "acc_norm": 0.7192660550458716, "acc_norm_stderr": 0.019266055045871623 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3472222222222222, "acc_stderr": 0.03246887243637649, "acc_norm": 0.3472222222222222, "acc_norm_stderr": 0.03246887243637649 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.696078431372549, "acc_stderr": 0.03228210387037893, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.03228210387037893 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7215189873417721, "acc_stderr": 0.029178682304842548, "acc_norm": 0.7215189873417721, "acc_norm_stderr": 0.029178682304842548 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6053811659192825, "acc_stderr": 0.03280400504755291, "acc_norm": 0.6053811659192825, "acc_norm_stderr": 0.03280400504755291 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6564885496183206, "acc_stderr": 0.041649760719448786, "acc_norm": 0.6564885496183206, "acc_norm_stderr": 0.041649760719448786 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6446280991735537, "acc_stderr": 0.04369236326573981, "acc_norm": 0.6446280991735537, "acc_norm_stderr": 0.04369236326573981 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5185185185185185, "acc_stderr": 0.04830366024635331, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.04830366024635331 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6134969325153374, "acc_stderr": 0.038258255488486076, "acc_norm": 0.6134969325153374, "acc_norm_stderr": 0.038258255488486076 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489122, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489122 }, "harness|hendrycksTest-management|5": { "acc": 0.6601941747572816, "acc_stderr": 0.04689765937278135, "acc_norm": 0.6601941747572816, "acc_norm_stderr": 0.04689765937278135 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7692307692307693, "acc_stderr": 0.0276019213814176, "acc_norm": 0.7692307692307693, "acc_norm_stderr": 0.0276019213814176 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6794380587484036, "acc_stderr": 0.01668889331080377, "acc_norm": 0.6794380587484036, "acc_norm_stderr": 0.01668889331080377 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5404624277456648, "acc_stderr": 0.026830805998952233, "acc_norm": 0.5404624277456648, "acc_norm_stderr": 0.026830805998952233 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2837988826815642, "acc_stderr": 0.01507835897075177, "acc_norm": 0.2837988826815642, "acc_norm_stderr": 0.01507835897075177 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.630718954248366, "acc_stderr": 0.02763417668960266, "acc_norm": 0.630718954248366, "acc_norm_stderr": 0.02763417668960266 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5241157556270096, "acc_stderr": 0.028365041542564577, "acc_norm": 0.5241157556270096, "acc_norm_stderr": 0.028365041542564577 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6419753086419753, "acc_stderr": 0.0266756119260371, "acc_norm": 0.6419753086419753, "acc_norm_stderr": 0.0266756119260371 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.425531914893617, "acc_stderr": 0.02949482760014437, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.02949482760014437 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.38461538461538464, "acc_stderr": 0.012425548416302942, "acc_norm": 0.38461538461538464, "acc_norm_stderr": 0.012425548416302942 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5808823529411765, "acc_stderr": 0.02997280717046462, "acc_norm": 0.5808823529411765, "acc_norm_stderr": 0.02997280717046462 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5522875816993464, "acc_stderr": 0.020116925347422425, "acc_norm": 0.5522875816993464, "acc_norm_stderr": 0.020116925347422425 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5727272727272728, "acc_stderr": 0.04738198703545483, "acc_norm": 0.5727272727272728, "acc_norm_stderr": 0.04738198703545483 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6489795918367347, "acc_stderr": 0.03055531675557364, "acc_norm": 0.6489795918367347, "acc_norm_stderr": 0.03055531675557364 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7014925373134329, "acc_stderr": 0.032357437893550424, "acc_norm": 0.7014925373134329, "acc_norm_stderr": 0.032357437893550424 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7485380116959064, "acc_stderr": 0.033275044238468436, "acc_norm": 0.7485380116959064, "acc_norm_stderr": 0.033275044238468436 }, "harness|truthfulqa:mc|0": { "mc1": 0.24969400244798043, "mc1_stderr": 0.015152286907148125, "mc2": 0.4986404189534564, "mc2_stderr": 0.01637095679991428 }, "harness|winogrande|5": { "acc": 0.659037095501184, "acc_stderr": 0.01332268143593479 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1
[ "region:us" ]
2024-01-28T16:26:58+00:00
{"pretty_name": "Evaluation run of freecs/ThetaWave-28B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [freecs/ThetaWave-28B-v0.1](https://huggingface.co/freecs/ThetaWave-28B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T16:24:41.257091](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1/blob/main/results_2024-01-28T16-24-41.257091.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5298897040848843,\n \"acc_stderr\": 0.03399795127047971,\n \"acc_norm\": 0.5387301326295254,\n \"acc_norm_stderr\": 0.03493833391824102,\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.4986404189534564,\n \"mc2_stderr\": 0.01637095679991428\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3097269624573379,\n \"acc_stderr\": 0.013512058415238361,\n \"acc_norm\": 0.3660409556313993,\n \"acc_norm_stderr\": 0.014077223108470137\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29087831109340767,\n \"acc_stderr\": 0.004532393111248689,\n \"acc_norm\": 0.35540728938458477,\n \"acc_norm_stderr\": 0.00477658353090957\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.030325945789286105,\n \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.030325945789286105\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700293,\n \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700293\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03214536859788639,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03214536859788639\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871623,\n \"acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871623\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.03246887243637649,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.03246887243637649\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037893,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037893\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.038258255488486076,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.038258255488486076\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6794380587484036,\n \"acc_stderr\": 0.01668889331080377,\n \"acc_norm\": 0.6794380587484036,\n \"acc_norm_stderr\": 0.01668889331080377\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952233,\n \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952233\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n \"acc_stderr\": 0.01507835897075177,\n \"acc_norm\": 0.2837988826815642,\n \"acc_norm_stderr\": 0.01507835897075177\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.5241157556270096,\n \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.012425548416302942,\n \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.012425548416302942\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.02997280717046462,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.02997280717046462\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.032357437893550424,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.032357437893550424\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24969400244798043,\n \"mc1_stderr\": 0.015152286907148125,\n \"mc2\": 0.4986404189534564,\n \"mc2_stderr\": 0.01637095679991428\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.659037095501184,\n \"acc_stderr\": 0.01332268143593479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/freecs/ThetaWave-28B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|arc:challenge|25_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|gsm8k|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hellaswag|10_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-24-41.257091.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-24-41.257091.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-24-41.257091.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T16-24-41.257091.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-24-41.257091.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T16_24_41.257091", "path": ["**/details_harness|winogrande|5_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T16-24-41.257091.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T16_24_41.257091", "path": ["results_2024-01-28T16-24-41.257091.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T16-24-41.257091.parquet"]}]}]}
2024-01-28T16:27:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of freecs/ThetaWave-28B-v0.1 Dataset automatically created during the evaluation run of model freecs/ThetaWave-28B-v0.1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T16:24:41.257091 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
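The card above points to a loading snippet; a minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming for this model and using the `harness_winogrande_5` config listed in this record's metadata:

```python
from datasets import load_dataset

# Repo name is an assumption, following the details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1",
    "harness_winogrande_5",
    split="train",
)
```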
[ "# Dataset Card for Evaluation run of freecs/ThetaWave-28B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-28B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T16:24:41.257091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of freecs/ThetaWave-28B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-28B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T16:24:41.257091(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b3012ee60365dd18b416ece27f256e91924eef46
# Dataset Card for "SNIPS" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/SNIPS
[ "region:us" ]
2024-01-28T16:33:24+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19197922646.0, "num_examples": 209344}, {"name": "test", "num_bytes": 1035368762.0, "num_examples": 11200}, {"name": "valid", "num_bytes": 1047359800.0, "num_examples": 11200}], "download_size": 21173943484, "dataset_size": 21280651208.0}}
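The configuration above declares a single `default` config with `train`/`test`/`valid` splits and two features, `audio` and `id`; a minimal loading sketch, assuming the standard `datasets` audio decoding behaviour:

```python
from datasets import load_dataset

# Load the "valid" split declared in the default config
ds = load_dataset("Codec-SUPERB/SNIPS", split="valid")

example = ds[0]
# "audio" decodes to a dict with "array" and "sampling_rate"; "id" is a plain string
print(example["id"], example["audio"]["sampling_rate"], len(example["audio"]["array"]))
```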
2024-01-28T16:51:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "SNIPS" More Information needed
[ "# Dataset Card for \"SNIPS\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"SNIPS\"\n\nMore Information needed" ]
520feb52d6023c578c1e498f8cb6af0856983435
# HowToStep HowToStep is an automatically generated, large-scale and high-quality dataset that transforms ASR transcripts into descriptive steps by prompting an LLM and then aligns the steps to the video through a two-stage determination procedure. [[project page]](https://lzq5.github.io/Video-Text-Alignment/) [[Arxiv]](https://arxiv.org/abs/2312.14055) [[GitHub]](https://github.com/Lzq5/Video-Text-Alignment) ## Analysis ![](src/analysis.png) HowToStep transforms the original transcripts of [*HTM-370K*](https://www.robots.ox.ac.uk/~vgg/research/tan/htm_sentencify_stats.html) into around 4M ordered instructional steps with start/end timestamps for almost 340K videos after filtering. As shown in the figure, the average number of steps (sentences) per video is 10.6 and the average number of words per step is 8.0. ## Download We provide a tar.gz file. After decompression, each folder contains files named `vid.pth`, where `vid` is the video ID. ### Data Instances ``` {'vid': '_sAn5Pp9GxQ', 'start': [33, 36, 42, ..., 398], 'end': [41, 44, 50, ..., 406], 'text': ['Add pasta to boiling water.', 'Keep boiling until pasta is al dente.', 'Quinoa pasta, corn pasta, or brown rice pasta.', ..., "Check out the creator's quick prep meal plan program for more recipe ideas."]} ``` ### Data Fields * vid (str): ID of the video. * start/end (List of int): start/end times of steps in the video. * text (List of str): descriptive steps generated by the large language model. ## Citation We appreciate your use of HowToStep in your work. If you find this repository helpful, please consider citing it. Feel free to contact [email protected] or open an issue if you have any questions. ```bibtex @article{li2023strong, title={A Strong Baseline for Temporal Video-Text Alignment}, author={Li, Zeqian and Chen, Qirui and Han, Tengda and Zhang, Ya and Wang, Yanfeng and Xie, Weidi}, journal={arXiv preprint arXiv:2312.14055}, year={2023} } ```
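Assuming each `vid.pth` file stores the dict shown under Data Instances, a minimal sketch of reading one file with `torch.load` (the path and file name below are hypothetical):

```python
import torch

# Hypothetical path: one file extracted from the provided tar.gz, named after its video ID
steps = torch.load("HowToStep/_sAn5Pp9GxQ.pth")

# Fields per the card: 'vid' (str), 'start'/'end' (lists of int), 'text' (list of str)
for start, end, text in zip(steps["start"], steps["end"], steps["text"]):
    print(f"[{start}-{end}] {text}")
```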
zeqianli/HowToStep
[ "arxiv:2312.14055", "region:us" ]
2024-01-28T16:43:05+00:00
{}
2024-01-31T08:46:09+00:00
[ "2312.14055" ]
[]
TAGS #arxiv-2312.14055 #region-us
# HowToStep HowToStep is an automatically generated, large-scale and high-quality dataset that transforms ASR transcripts into descriptive steps by prompting the LLM and then aligns steps to the video through a two-stage determination procedure. [[project page]](URL [[Arxiv]](URL [[GitHub]](URL ## Analysis ![](src/URL) HowToStep transforms the original transcripts of *HTM-370K* into around 4M ordered instructional steps with start/end timestamps for almost 340K videos after filtering. As shown in figure, the average steps (sentences) per video is 10.6 and the average words per step is 8.0. ## Download We provide a URL file. After decompression, each folder contains files named by 'URL'. ### Data Instances ### Data Fields * vid (str): ID of the video. * start/end (List of int): start/end time of steps in video. * text (List of str): descriptive steps generated by large language model. We appreciate your use of HowToStep in your work. If you find this repository helpful, please consider citing it. Feel free to contact lzq0103@URL or open an issue if you have any questions.
[ "# HowToStep\n\nHowToStep is an automatically generated, large-scale and high-quality dataset that transforms ASR transcripts into descriptive steps by prompting the LLM and then aligns steps to the video through a two-stage determination procedure.\n\n[[project page]](URL\n[[Arxiv]](URL\n[[GitHub]](URL", "## Analysis\n\n![](src/URL)\n\nHowToStep transforms the original transcripts of *HTM-370K* into around 4M ordered instructional steps with start/end timestamps for almost 340K videos after filtering.\nAs shown in figure, the average steps (sentences) per video is 10.6 and the average words per step is 8.0.", "## Download\n\nWe provide a URL file. After decompression, each folder contains files named by 'URL'.", "### Data Instances", "### Data Fields\n\n* vid (str): ID of the video.\n* start/end (List of int): start/end time of steps in video.\n* text (List of str): descriptive steps generated by large language model.\n\nWe appreciate your use of HowToStep in your work. If you find this repository helpful, please consider citing it. Feel free to contact lzq0103@URL or open an issue if you have any questions." ]
[ "TAGS\n#arxiv-2312.14055 #region-us \n", "# HowToStep\n\nHowToStep is an automatically generated, large-scale and high-quality dataset that transforms ASR transcripts into descriptive steps by prompting the LLM and then aligns steps to the video through a two-stage determination procedure.\n\n[[project page]](URL\n[[Arxiv]](URL\n[[GitHub]](URL", "## Analysis\n\n![](src/URL)\n\nHowToStep transforms the original transcripts of *HTM-370K* into around 4M ordered instructional steps with start/end timestamps for almost 340K videos after filtering.\nAs shown in figure, the average steps (sentences) per video is 10.6 and the average words per step is 8.0.", "## Download\n\nWe provide a URL file. After decompression, each folder contains files named by 'URL'.", "### Data Instances", "### Data Fields\n\n* vid (str): ID of the video.\n* start/end (List of int): start/end time of steps in video.\n* text (List of str): descriptive steps generated by large language model.\n\nWe appreciate your use of HowToStep in your work. If you find this repository helpful, please consider citing it. Feel free to contact lzq0103@URL or open an issue if you have any questions." ]
4f4abe3b86cfbd45236eddab8a84b7ae578fdb38
# Dataset Card for Evaluation run of AA051610/O0128 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [AA051610/O0128](https://huggingface.co/AA051610/O0128) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_AA051610__O0128", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T17:02:36.892419](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__O0128/blob/main/results_2024-01-28T17-02-36.892419.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.8273006081081993, "acc_stderr": 0.024663470781539607, "acc_norm": 0.8335369148949214, "acc_norm_stderr": 0.025075862506569718, "mc1": 0.43451652386780903, "mc1_stderr": 0.017352738749259564, "mc2": 0.6012752268438828, "mc2_stderr": 0.014979362035595621 }, "harness|arc:challenge|25": { "acc": 0.6552901023890785, "acc_stderr": 0.01388881628678211, "acc_norm": 0.6791808873720137, "acc_norm_stderr": 0.013640943091946528 }, "harness|hellaswag|10": { "acc": 0.6569408484365664, "acc_stderr": 0.004737608340163403, "acc_norm": 0.853415654252141, "acc_norm_stderr": 0.003529682285857263 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.8222222222222222, "acc_stderr": 0.033027898599017176, "acc_norm": 0.8222222222222222, "acc_norm_stderr": 0.033027898599017176 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.9013157894736842, "acc_stderr": 0.02427022773752271, "acc_norm": 0.9013157894736842, "acc_norm_stderr": 0.02427022773752271 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.82, "acc_stderr": 0.038612291966536955, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8641509433962264, "acc_stderr": 0.02108730862243985, "acc_norm": 0.8641509433962264, "acc_norm_stderr": 0.02108730862243985 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9722222222222222, "acc_stderr": 0.013742429025504266, "acc_norm": 0.9722222222222222, "acc_norm_stderr": 0.013742429025504266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 
0.57, "acc_stderr": 0.04975698519562427, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562427 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.8265895953757225, "acc_stderr": 0.02886810787497064, "acc_norm": 0.8265895953757225, "acc_norm_stderr": 0.02886810787497064 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.047551296160629475, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.86, "acc_stderr": 0.03487350880197771, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197771 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.8468085106382979, "acc_stderr": 0.023545179061675203, "acc_norm": 0.8468085106382979, "acc_norm_stderr": 0.023545179061675203 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.7105263157894737, "acc_stderr": 0.04266339443159394, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.04266339443159394 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8620689655172413, "acc_stderr": 0.028735632183908084, "acc_norm": 0.8620689655172413, "acc_norm_stderr": 0.028735632183908084 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.791005291005291, "acc_stderr": 0.020940481565334863, "acc_norm": 0.791005291005291, "acc_norm_stderr": 0.020940481565334863 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.6507936507936508, "acc_stderr": 0.04263906892795131, "acc_norm": 0.6507936507936508, "acc_norm_stderr": 0.04263906892795131 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9419354838709677, "acc_stderr": 0.01330413811280927, "acc_norm": 0.9419354838709677, "acc_norm_stderr": 0.01330413811280927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.7635467980295566, "acc_stderr": 0.029896114291733545, "acc_norm": 0.7635467980295566, "acc_norm_stderr": 0.029896114291733545 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.9151515151515152, "acc_stderr": 0.021759385340835914, "acc_norm": 0.9151515151515152, "acc_norm_stderr": 0.021759385340835914 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9696969696969697, "acc_stderr": 0.012213156893572809, "acc_norm": 0.9696969696969697, "acc_norm_stderr": 0.012213156893572809 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9792746113989638, "acc_stderr": 0.010281417011909029, "acc_norm": 0.9792746113989638, "acc_norm_stderr": 0.010281417011909029 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8692307692307693, "acc_stderr": 0.017094072023289646, "acc_norm": 0.8692307692307693, "acc_norm_stderr": 0.017094072023289646 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.6111111111111112, "acc_stderr": 0.029723278961476664, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.029723278961476664 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.9327731092436975, "acc_stderr": 0.016266171559293868, "acc_norm": 0.9327731092436975, "acc_norm_stderr": 0.016266171559293868 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.6291390728476821, "acc_stderr": 0.03943966699183629, "acc_norm": 0.6291390728476821, 
"acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.944954128440367, "acc_stderr": 0.009778411055200768, "acc_norm": 0.944954128440367, "acc_norm_stderr": 0.009778411055200768 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.7407407407407407, "acc_stderr": 0.02988691054762698, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.02988691054762698 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9607843137254902, "acc_stderr": 0.013623692819208832, "acc_norm": 0.9607843137254902, "acc_norm_stderr": 0.013623692819208832 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9493670886075949, "acc_stderr": 0.014271760025370185, "acc_norm": 0.9493670886075949, "acc_norm_stderr": 0.014271760025370185 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.852017937219731, "acc_stderr": 0.02383155715761354, "acc_norm": 0.852017937219731, "acc_norm_stderr": 0.02383155715761354 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.9083969465648855, "acc_stderr": 0.025300035578642962, "acc_norm": 0.9083969465648855, "acc_norm_stderr": 0.025300035578642962 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9338842975206612, "acc_stderr": 0.022683403691723312, "acc_norm": 0.9338842975206612, "acc_norm_stderr": 0.022683403691723312 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.9259259259259259, "acc_stderr": 0.025317997297209727, "acc_norm": 0.9259259259259259, "acc_norm_stderr": 0.025317997297209727 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.9325153374233128, "acc_stderr": 0.01970938281499787, "acc_norm": 0.9325153374233128, "acc_norm_stderr": 0.01970938281499787 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.7232142857142857, "acc_stderr": 0.04246624336697623, "acc_norm": 0.7232142857142857, "acc_norm_stderr": 0.04246624336697623 }, "harness|hendrycksTest-management|5": { "acc": 0.9320388349514563, "acc_stderr": 0.024919959142514464, "acc_norm": 0.9320388349514563, "acc_norm_stderr": 0.024919959142514464 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9786324786324786, "acc_stderr": 0.009473466537245874, "acc_norm": 0.9786324786324786, "acc_norm_stderr": 0.009473466537245874 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.91, "acc_stderr": 0.028762349126466136, "acc_norm": 0.91, "acc_norm_stderr": 0.028762349126466136 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9450830140485313, "acc_stderr": 0.008146760500752312, "acc_norm": 0.9450830140485313, "acc_norm_stderr": 0.008146760500752312 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.861271676300578, "acc_stderr": 0.018609859931640438, "acc_norm": 0.861271676300578, "acc_norm_stderr": 0.018609859931640438 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.8603351955307262, "acc_stderr": 0.011593340045150927, "acc_norm": 0.8603351955307262, "acc_norm_stderr": 0.011593340045150927 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.9117647058823529, "acc_stderr": 0.01624099518367418, "acc_norm": 0.9117647058823529, "acc_norm_stderr": 0.01624099518367418 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.9035369774919614, "acc_stderr": 0.016767663560541785, "acc_norm": 0.9035369774919614, "acc_norm_stderr": 0.016767663560541785 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.9012345679012346, "acc_stderr": 0.01660046080164534, "acc_norm": 0.9012345679012346, "acc_norm_stderr": 0.01660046080164534 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.7375886524822695, 
"acc_stderr": 0.026244920349843007, "acc_norm": 0.7375886524822695, "acc_norm_stderr": 0.026244920349843007 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.7685788787483703, "acc_stderr": 0.010771461711576476, "acc_norm": 0.7685788787483703, "acc_norm_stderr": 0.010771461711576476 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.9227941176470589, "acc_stderr": 0.016214104160827764, "acc_norm": 0.9227941176470589, "acc_norm_stderr": 0.016214104160827764 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.880718954248366, "acc_stderr": 0.013112448195110083, "acc_norm": 0.880718954248366, "acc_norm_stderr": 0.013112448195110083 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.8, "acc_stderr": 0.03831305140884601, "acc_norm": 0.8, "acc_norm_stderr": 0.03831305140884601 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8775510204081632, "acc_stderr": 0.020985477705882164, "acc_norm": 0.8775510204081632, "acc_norm_stderr": 0.020985477705882164 }, "harness|hendrycksTest-sociology|5": { "acc": 0.9601990049751243, "acc_stderr": 0.013823327352686389, "acc_norm": 0.9601990049751243, "acc_norm_stderr": 0.013823327352686389 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.95, "acc_stderr": 0.021904291355759036, "acc_norm": 0.95, "acc_norm_stderr": 0.021904291355759036 }, "harness|hendrycksTest-virology|5": { "acc": 0.6265060240963856, "acc_stderr": 0.037658451171688624, "acc_norm": 0.6265060240963856, "acc_norm_stderr": 0.037658451171688624 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.9298245614035088, "acc_stderr": 0.019591541754525123, "acc_norm": 0.9298245614035088, "acc_norm_stderr": 0.019591541754525123 }, "harness|truthfulqa:mc|0": { "mc1": 0.43451652386780903, "mc1_stderr": 0.017352738749259564, "mc2": 0.6012752268438828, "mc2_stderr": 0.014979362035595621 }, "harness|winogrande|5": { "acc": 0.8224151539068666, "acc_stderr": 0.010740676861359238 }, "harness|gsm8k|5": { "acc": 0.6846095526914329, "acc_stderr": 0.012799353675801834 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
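Besides the per-task configs, the card mentions a "results" configuration holding the aggregated metrics; a minimal sketch of loading it (the "latest" split name follows the convention used by the other leaderboard records in this dump and is an assumption here):

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" points at the newest results parquet
results = load_dataset(
    "open-llm-leaderboard/details_AA051610__O0128",
    "results",
    split="latest",
)
print(results[0])
```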
open-llm-leaderboard/details_AA051610__O0128
[ "region:us" ]
2024-01-28T17:04:56+00:00
{"pretty_name": "Evaluation run of AA051610/O0128", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051610/O0128](https://huggingface.co/AA051610/O0128) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__O0128\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T17:02:36.892419](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__O0128/blob/main/results_2024-01-28T17-02-36.892419.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8273006081081993,\n \"acc_stderr\": 0.024663470781539607,\n \"acc_norm\": 0.8335369148949214,\n \"acc_norm_stderr\": 0.025075862506569718,\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6012752268438828,\n \"mc2_stderr\": 0.014979362035595621\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946528\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6569408484365664,\n \"acc_stderr\": 0.004737608340163403,\n \"acc_norm\": 0.853415654252141,\n \"acc_norm_stderr\": 0.003529682285857263\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8222222222222222,\n \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.8222222222222222,\n \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8641509433962264,\n \"acc_stderr\": 0.02108730862243985,\n \"acc_norm\": 0.8641509433962264,\n \"acc_norm_stderr\": 0.02108730862243985\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9722222222222222,\n \"acc_stderr\": 0.013742429025504266,\n \"acc_norm\": 0.9722222222222222,\n \"acc_norm_stderr\": 0.013742429025504266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n 
\"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.02886810787497064,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.02886810787497064\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8468085106382979,\n \"acc_stderr\": 0.023545179061675203,\n \"acc_norm\": 0.8468085106382979,\n \"acc_norm_stderr\": 0.023545179061675203\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8620689655172413,\n \"acc_stderr\": 0.028735632183908084,\n \"acc_norm\": 0.8620689655172413,\n \"acc_norm_stderr\": 0.028735632183908084\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.791005291005291,\n \"acc_stderr\": 0.020940481565334863,\n \"acc_norm\": 0.791005291005291,\n \"acc_norm_stderr\": 0.020940481565334863\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6507936507936508,\n \"acc_stderr\": 0.04263906892795131,\n \"acc_norm\": 0.6507936507936508,\n \"acc_norm_stderr\": 0.04263906892795131\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9419354838709677,\n \"acc_stderr\": 0.01330413811280927,\n \"acc_norm\": 0.9419354838709677,\n \"acc_norm_stderr\": 0.01330413811280927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.7635467980295566,\n \"acc_stderr\": 0.029896114291733545,\n \"acc_norm\": 0.7635467980295566,\n \"acc_norm_stderr\": 0.029896114291733545\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.9151515151515152,\n \"acc_stderr\": 0.021759385340835914,\n \"acc_norm\": 0.9151515151515152,\n \"acc_norm_stderr\": 0.021759385340835914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9696969696969697,\n \"acc_stderr\": 0.012213156893572809,\n \"acc_norm\": 0.9696969696969697,\n \"acc_norm_stderr\": 0.012213156893572809\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909029,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909029\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8692307692307693,\n 
\"acc_stderr\": 0.017094072023289646,\n \"acc_norm\": 0.8692307692307693,\n \"acc_norm_stderr\": 0.017094072023289646\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.9327731092436975,\n \"acc_stderr\": 0.016266171559293868,\n \"acc_norm\": 0.9327731092436975,\n \"acc_norm_stderr\": 0.016266171559293868\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.6291390728476821,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.6291390728476821,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.944954128440367,\n \"acc_stderr\": 0.009778411055200768,\n \"acc_norm\": 0.944954128440367,\n \"acc_norm_stderr\": 0.009778411055200768\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02988691054762698,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02988691054762698\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9607843137254902,\n \"acc_stderr\": 0.013623692819208832,\n \"acc_norm\": 0.9607843137254902,\n \"acc_norm_stderr\": 0.013623692819208832\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370185,\n \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.852017937219731,\n \"acc_stderr\": 0.02383155715761354,\n \"acc_norm\": 0.852017937219731,\n \"acc_norm_stderr\": 0.02383155715761354\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723312,\n \"acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723312\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9259259259259259,\n \"acc_stderr\": 0.025317997297209727,\n \"acc_norm\": 0.9259259259259259,\n \"acc_norm_stderr\": 0.025317997297209727\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.9325153374233128,\n \"acc_stderr\": 0.01970938281499787,\n \"acc_norm\": 0.9325153374233128,\n \"acc_norm_stderr\": 0.01970938281499787\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.7232142857142857,\n \"acc_stderr\": 0.04246624336697623,\n \"acc_norm\": 0.7232142857142857,\n \"acc_norm_stderr\": 0.04246624336697623\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9320388349514563,\n \"acc_stderr\": 0.024919959142514464,\n \"acc_norm\": 0.9320388349514563,\n \"acc_norm_stderr\": 0.024919959142514464\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9786324786324786,\n \"acc_stderr\": 0.009473466537245874,\n \"acc_norm\": 0.9786324786324786,\n \"acc_norm_stderr\": 0.009473466537245874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466136,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466136\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9450830140485313,\n \"acc_stderr\": 0.008146760500752312,\n \"acc_norm\": 
0.9450830140485313,\n \"acc_norm_stderr\": 0.008146760500752312\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.861271676300578,\n \"acc_stderr\": 0.018609859931640438,\n \"acc_norm\": 0.861271676300578,\n \"acc_norm_stderr\": 0.018609859931640438\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8603351955307262,\n \"acc_stderr\": 0.011593340045150927,\n \"acc_norm\": 0.8603351955307262,\n \"acc_norm_stderr\": 0.011593340045150927\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01624099518367418,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01624099518367418\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.9035369774919614,\n \"acc_stderr\": 0.016767663560541785,\n \"acc_norm\": 0.9035369774919614,\n \"acc_norm_stderr\": 0.016767663560541785\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.9012345679012346,\n \"acc_stderr\": 0.01660046080164534,\n \"acc_norm\": 0.9012345679012346,\n \"acc_norm_stderr\": 0.01660046080164534\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.7375886524822695,\n \"acc_stderr\": 0.026244920349843007,\n \"acc_norm\": 0.7375886524822695,\n \"acc_norm_stderr\": 0.026244920349843007\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.7685788787483703,\n \"acc_stderr\": 0.010771461711576476,\n \"acc_norm\": 0.7685788787483703,\n \"acc_norm_stderr\": 0.010771461711576476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.9227941176470589,\n \"acc_stderr\": 0.016214104160827764,\n \"acc_norm\": 0.9227941176470589,\n \"acc_norm_stderr\": 0.016214104160827764\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.880718954248366,\n \"acc_stderr\": 0.013112448195110083,\n \"acc_norm\": 0.880718954248366,\n \"acc_norm_stderr\": 0.013112448195110083\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8775510204081632,\n \"acc_stderr\": 0.020985477705882164,\n \"acc_norm\": 0.8775510204081632,\n \"acc_norm_stderr\": 0.020985477705882164\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9601990049751243,\n \"acc_stderr\": 0.013823327352686389,\n \"acc_norm\": 0.9601990049751243,\n \"acc_norm_stderr\": 0.013823327352686389\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759036,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759036\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6265060240963856,\n \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.6265060240963856,\n \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9298245614035088,\n \"acc_stderr\": 0.019591541754525123,\n \"acc_norm\": 0.9298245614035088,\n \"acc_norm_stderr\": 0.019591541754525123\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6012752268438828,\n \"mc2_stderr\": 0.014979362035595621\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359238\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \"acc_stderr\": 0.012799353675801834\n }\n}\n```", "repo_url": "https://huggingface.co/AA051610/O0128", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|arc:challenge|25_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|gsm8k|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hellaswag|10_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-02-36.892419.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["**/details_harness|winogrande|5_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T17-02-36.892419.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T17_02_36.892419", "path": ["results_2024-01-28T17-02-36.892419.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T17-02-36.892419.parquet"]}]}]}
2024-01-28T17:05:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AA051610/O0128 Dataset automatically created during the evaluation run of model AA051610/O0128 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T17:02:36.892419 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
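(The loading snippet the card refers to is not reproduced in this flattened text. A minimal sketch is given below; the details repository id `open-llm-leaderboard/details_AA051610__O0128` is inferred from the leaderboard's usual `details_<org>__<model>` naming and is not quoted from this record, while `harness_winogrande_5` and the `latest` split are taken from the configuration list in the metadata above.)

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the Open LLM Leaderboard naming convention
# (details_<org>__<model>); the config name and the "latest" split come from the
# configuration list in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_AA051610__O0128",
    "harness_winogrande_5",
    split="latest",
)
```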
[ "# Dataset Card for Evaluation run of AA051610/O0128\n\n\n\nDataset automatically created during the evaluation run of model AA051610/O0128 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T17:02:36.892419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AA051610/O0128\n\n\n\nDataset automatically created during the evaluation run of model AA051610/O0128 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T17:02:36.892419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
86fe75600a0f1ef96b4446a133749546ae851e43
# Dataset Card for Evaluation run of notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30](https://huggingface.co/notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T17:55:35.566677](https://huggingface.co/datasets/open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30/blob/main/results_2024-01-28T17-55-35.566677.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6049166943557902, "acc_stderr": 0.03319909638932957, "acc_norm": 0.6094056825669654, "acc_norm_stderr": 0.0338749614846566, "mc1": 0.5238678090575275, "mc1_stderr": 0.017483547156961567, "mc2": 0.6748850342094754, "mc2_stderr": 0.01522130477706919 }, "harness|arc:challenge|25": { "acc": 0.5844709897610921, "acc_stderr": 0.014401366641216386, "acc_norm": 0.6296928327645052, "acc_norm_stderr": 0.01411129875167495 }, "harness|hellaswag|10": { "acc": 0.6633140808603863, "acc_stderr": 0.00471610647590509, "acc_norm": 0.8471420035849433, "acc_norm_stderr": 0.003591151323268345 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.562962962962963, "acc_stderr": 0.04284958639753401, "acc_norm": 0.562962962962963, "acc_norm_stderr": 0.04284958639753401 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849723, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849723 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5780346820809249, "acc_stderr": 0.0376574669386515, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.0376574669386515 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726367, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726367 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099834, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099834 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.04615186962583703, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.04615186962583703 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.040824829046386284, "acc_norm": 0.6, "acc_norm_stderr": 0.040824829046386284 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.02525303255499769, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.02525303255499769 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6516129032258065, "acc_stderr": 0.02710482632810094, "acc_norm": 0.6516129032258065, "acc_norm_stderr": 0.02710482632810094 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.0303137105381989, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.0303137105381989 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8186528497409327, "acc_stderr": 0.02780703236068609, "acc_norm": 0.8186528497409327, "acc_norm_stderr": 0.02780703236068609 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.558974358974359, "acc_stderr": 0.025174048384000745, "acc_norm": 0.558974358974359, "acc_norm_stderr": 0.025174048384000745 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3037037037037037, "acc_stderr": 0.028037929969114993, "acc_norm": 0.3037037037037037, "acc_norm_stderr": 0.028037929969114993 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 
0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7871559633027523, "acc_stderr": 0.017549376389313694, "acc_norm": 0.7871559633027523, "acc_norm_stderr": 0.017549376389313694 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294635, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294635 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.03019028245350195, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7251908396946565, "acc_stderr": 0.03915345408847836, "acc_norm": 0.7251908396946565, "acc_norm_stderr": 0.03915345408847836 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.036401182719909476, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.036401182719909476 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650743, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650743 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.023636873317489294, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.023636873317489294 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7752234993614304, "acc_stderr": 0.014927447101937148, "acc_norm": 0.7752234993614304, "acc_norm_stderr": 0.014927447101937148 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6878612716763006, "acc_stderr": 0.024946792225272314, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.024946792225272314 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3039106145251397, "acc_stderr": 0.015382845587584518, "acc_norm": 0.3039106145251397, "acc_norm_stderr": 0.015382845587584518 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6862745098039216, "acc_stderr": 0.026568921015457138, "acc_norm": 0.6862745098039216, "acc_norm_stderr": 0.026568921015457138 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7037037037037037, "acc_stderr": 
0.025407197798890165, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.025407197798890165 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.02971928127223685, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.02971928127223685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4230769230769231, "acc_stderr": 0.012618204066588389, "acc_norm": 0.4230769230769231, "acc_norm_stderr": 0.012618204066588389 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6323529411764706, "acc_stderr": 0.02928941340940319, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.02928941340940319 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6356209150326797, "acc_stderr": 0.019469518221573705, "acc_norm": 0.6356209150326797, "acc_norm_stderr": 0.019469518221573705 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.029279567411065674, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.029279567411065674 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7562189054726368, "acc_stderr": 0.030360490154014645, "acc_norm": 0.7562189054726368, "acc_norm_stderr": 0.030360490154014645 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333047, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333047 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5238678090575275, "mc1_stderr": 0.017483547156961567, "mc2": 0.6748850342094754, "mc2_stderr": 0.01522130477706919 }, "harness|winogrande|5": { "acc": 0.7797947908445146, "acc_stderr": 0.011646276755089694 }, "harness|gsm8k|5": { "acc": 0.39423805913570886, "acc_stderr": 0.013460852357095656 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
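The card above notes that the aggregated metrics of the run live in a separate "results" configuration. A short sketch of reading them is given below; it assumes the "results" configuration exposes the same "latest" split naming as the per-task configurations listed for this record.

```python
from datasets import load_dataset

# Load the aggregated results for the run; "latest" is assumed to point at the
# most recent evaluation, mirroring the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30",
    "results",
    split="latest",
)
# Each row holds the aggregated metrics (acc, acc_norm, mc1/mc2, ...) as columns.
print(results[0])
```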
open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30
[ "region:us" ]
2024-01-28T17:57:52+00:00
{"pretty_name": "Evaluation run of notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30", "dataset_summary": "Dataset automatically created during the evaluation run of model [notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30](https://huggingface.co/notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T17:55:35.566677](https://huggingface.co/datasets/open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30/blob/main/results_2024-01-28T17-55-35.566677.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6049166943557902,\n \"acc_stderr\": 0.03319909638932957,\n \"acc_norm\": 0.6094056825669654,\n \"acc_norm_stderr\": 0.0338749614846566,\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6748850342094754,\n \"mc2_stderr\": 0.01522130477706919\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216386,\n \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6633140808603863,\n \"acc_stderr\": 0.00471610647590509,\n \"acc_norm\": 0.8471420035849433,\n \"acc_norm_stderr\": 0.003591151323268345\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n 
\"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.8186528497409327,\n 
\"acc_norm_stderr\": 0.02780703236068609\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294635,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294635\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.014927447101937148,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.014927447101937148\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n \"acc_stderr\": 0.015382845587584518,\n \"acc_norm\": 0.3039106145251397,\n \"acc_norm_stderr\": 0.015382845587584518\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457138,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457138\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890165,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890165\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.012618204066588389,\n \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.012618204066588389\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6748850342094754,\n \"mc2_stderr\": 0.01522130477706919\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089694\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.39423805913570886,\n \"acc_stderr\": 0.013460852357095656\n }\n}\n```", "repo_url": "https://huggingface.co/notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|arc:challenge|25_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|gsm8k|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hellaswag|10_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-55-35.566677.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-55-35.566677.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-55-35.566677.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T17-55-35.566677.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T17-55-35.566677.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["**/details_harness|winogrande|5_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-28T17-55-35.566677.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T17_55_35.566677", "path": ["results_2024-01-28T17-55-35.566677.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T17-55-35.566677.parquet"]}]}]}
2024-01-28T17:58:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30 Dataset automatically created during the evaluation run of model notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T17:55:35.566677 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
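The loading snippet referenced by "To load the details from a run, you can for instance do the following:" was stripped from this processed text. A minimal sketch of what it typically looks like, assuming the open-llm-leaderboard naming pattern `details_<org>__<model>` and the "harness_winogrande_5" config listed in this record's metadata:

```python
from datasets import load_dataset

# Dataset id inferred from the leaderboard pattern: details_<org>__<model>
data = load_dataset(
    "open-llm-leaderboard/details_notadib__Mistral-7B-Instruct-v0.2-attention-sparsity-30",
    "harness_winogrande_5",  # one of the 63 per-task configs listed in this record's metadata
    split="train",           # per the card, "train" always points to the latest results
)
print(data)
```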
[ "# Dataset Card for Evaluation run of notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30\n\n\n\nDataset automatically created during the evaluation run of model notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T17:55:35.566677(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30\n\n\n\nDataset automatically created during the evaluation run of model notadib/Mistral-7B-Instruct-v0.2-attention-sparsity-30 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T17:55:35.566677(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
9906254e7257df43133fba883b72efca94518485
# Dataset Card for Evaluation run of eren23/FrankenBeagle-SmallOverlap-test <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [eren23/FrankenBeagle-SmallOverlap-test](https://huggingface.co/eren23/FrankenBeagle-SmallOverlap-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_eren23__FrankenBeagle-SmallOverlap-test", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:01:48.091573](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__FrankenBeagle-SmallOverlap-test/blob/main/results_2024-01-28T18-01-48.091573.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6516048577093372, "acc_stderr": 0.03217886824872047, "acc_norm": 0.6522986567968078, "acc_norm_stderr": 0.03283383614658343, "mc1": 0.5642594859241126, "mc1_stderr": 0.017358345398863134, "mc2": 0.6969160518300113, "mc2_stderr": 0.015146787132780715 }, "harness|arc:challenge|25": { "acc": 0.6945392491467577, "acc_stderr": 0.013460080478002505, "acc_norm": 0.7201365187713311, "acc_norm_stderr": 0.01311904089772592 }, "harness|hellaswag|10": { "acc": 0.7171878111929895, "acc_stderr": 0.004494454911844622, "acc_norm": 0.8815972913762199, "acc_norm_stderr": 0.003224240722351316 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.02815283794249386, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.02815283794249386 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, 
"acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108102, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.04697085136647863, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.04697085136647863 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.025305906241590632, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.025305906241590632 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356853, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356853 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.02962022787479048, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.02962022787479048 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.023610884308927865, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.023610884308927865 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.028661201116524572, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.028661201116524572 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6596638655462185, "acc_stderr": 0.030778057422931673, "acc_norm": 0.6596638655462185, "acc_norm_stderr": 0.030778057422931673 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8366972477064221, "acc_stderr": 0.01584825580650155, "acc_norm": 0.8366972477064221, "acc_norm_stderr": 0.01584825580650155 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5555555555555556, "acc_stderr": 0.03388857118502325, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467617, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467617 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229143, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229143 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313729, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313729 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165612, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165612 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42793296089385474, "acc_stderr": 0.01654788799741611, "acc_norm": 0.42793296089385474, "acc_norm_stderr": 0.01654788799741611 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818737, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818737 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984813, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984813 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657476, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657476 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462927, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462927 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578323, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578323 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5642594859241126, "mc1_stderr": 0.017358345398863134, "mc2": 0.6969160518300113, "mc2_stderr": 0.015146787132780715 }, "harness|winogrande|5": { "acc": 0.8184688239936859, "acc_stderr": 0.01083327651500748 }, "harness|gsm8k|5": { "acc": 0.6338134950720242, "acc_stderr": 0.013270100238748835 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
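The card above also describes an aggregated "results" configuration whose "latest" split tracks the most recent run. A minimal sketch of reading those aggregated metrics, assuming that configuration and split exist for this repo as the card states:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; "latest" follows the newest timestamped split
results = load_dataset(
    "open-llm-leaderboard/details_eren23__FrankenBeagle-SmallOverlap-test",
    "results",
    split="latest",
)
print(results[0])  # a single row of aggregated results
```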
open-llm-leaderboard/details_eren23__FrankenBeagle-SmallOverlap-test
[ "region:us" ]
2024-01-28T18:04:04+00:00
{"pretty_name": "Evaluation run of eren23/FrankenBeagle-SmallOverlap-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [eren23/FrankenBeagle-SmallOverlap-test](https://huggingface.co/eren23/FrankenBeagle-SmallOverlap-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__FrankenBeagle-SmallOverlap-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:01:48.091573](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__FrankenBeagle-SmallOverlap-test/blob/main/results_2024-01-28T18-01-48.091573.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6516048577093372,\n \"acc_stderr\": 0.03217886824872047,\n \"acc_norm\": 0.6522986567968078,\n \"acc_norm_stderr\": 0.03283383614658343,\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.017358345398863134,\n \"mc2\": 0.6969160518300113,\n \"mc2_stderr\": 0.015146787132780715\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002505,\n \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.01311904089772592\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n \"acc_stderr\": 0.004494454911844622,\n \"acc_norm\": 0.8815972913762199,\n \"acc_norm_stderr\": 0.003224240722351316\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 
0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.017358345398863134,\n \"mc2\": 0.6969160518300113,\n \"mc2_stderr\": 0.015146787132780715\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.01083327651500748\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n 
\"acc_stderr\": 0.013270100238748835\n }\n}\n```", "repo_url": "https://huggingface.co/eren23/FrankenBeagle-SmallOverlap-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-01-48.091573.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-01-48.091573.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-01-48.091573.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-01-48.091573.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-01-48.091573.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_01_48.091573", "path": ["**/details_harness|winogrande|5_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T18-01-48.091573.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T18_01_48.091573", "path": ["results_2024-01-28T18-01-48.091573.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-01-48.091573.parquet"]}]}]}
2024-01-28T18:04:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of eren23/FrankenBeagle-SmallOverlap-test Dataset automatically created during the evaluation run of model eren23/FrankenBeagle-SmallOverlap-test on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:01:48.091573 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of eren23/FrankenBeagle-SmallOverlap-test\n\n\n\nDataset automatically created during the evaluation run of model eren23/FrankenBeagle-SmallOverlap-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:01:48.091573(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of eren23/FrankenBeagle-SmallOverlap-test\n\n\n\nDataset automatically created during the evaluation run of model eren23/FrankenBeagle-SmallOverlap-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:01:48.091573(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a6744a06a2aeb9492cefecd1cfdefdba7c9577ff
### What is this? This is a mmap of embeddings of clustered data in various languages and code. We will also release the actual data under cc-by-nc-2.0. ### Usage: You can use sklearn.cluster.KMeans or other clustering and semantic similarity libraries to find similarities between items.
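For illustration, below is a minimal sketch of the suggested workflow: memory-map the embeddings and cluster them with sklearn.cluster.KMeans. The file name, dtype, embedding dimension, and cluster count used here are placeholder assumptions, not values stated by this dataset.

```python
# Hedged sketch: cluster memory-mapped embeddings with KMeans.
# "embeddings.mmap", float32, dim=768 and n_clusters=10 are assumptions only.
import numpy as np
from sklearn.cluster import KMeans

dim = 768  # assumed embedding dimension
emb = np.memmap("embeddings.mmap", dtype=np.float32, mode="r")
emb = emb.reshape(-1, dim)  # one row per item

kmeans = KMeans(n_clusters=10, n_init=10, random_state=0).fit(emb)
print(kmeans.labels_[:20])  # cluster assignments of the first 20 items
```

To find the items most similar to a given one, cosine similarity over the same array (for example via sklearn.metrics.pairwise.cosine_similarity) works equally well.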
aurora-m/mmap
[ "license:cc-by-nc-2.0", "region:us" ]
2024-01-28T18:08:06+00:00
{"license": "cc-by-nc-2.0"}
2024-02-12T23:46:29+00:00
[]
[]
TAGS #license-cc-by-nc-2.0 #region-us
### What is this? This is a mmap of embeddings of clustered data in various languages and code. We will also release the actual data under cc-by-nc-2.0. ### Usage: You can use sklearn.cluster.KMeans or other clustering and semantic similarity libraries to find similarities between items.
[ "### What is this?\nThis is a mmap of embeddeings of clustered data in various languages and code. We will release the actual data also under cc-by-nc-2.0.", "### Usage:\nYou can use sklearn.cluster.KMeans or other clustering and semantic similarity libraries to find similarities between items." ]
[ "TAGS\n#license-cc-by-nc-2.0 #region-us \n", "### What is this?\nThis is a mmap of embeddeings of clustered data in various languages and code. We will release the actual data also under cc-by-nc-2.0.", "### Usage:\nYou can use sklearn.cluster.KMeans or other clustering and semantic similarity libraries to find similarities between items." ]
a2eb424a1240458ab1c86b9307ca7722f89c721a
# Dataset Card for Evaluation run of BlouseJury/Mistral-7B-Discord-0.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BlouseJury/Mistral-7B-Discord-0.2](https://huggingface.co/BlouseJury/Mistral-7B-Discord-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:06:21.915778](https://huggingface.co/datasets/open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2/blob/main/results_2024-01-28T18-06-21.915778.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6242801405322218, "acc_stderr": 0.03267853525645898, "acc_norm": 0.6311693770360961, "acc_norm_stderr": 0.033351034128342096, "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.42729089645764406, "mc2_stderr": 0.014384607984607447 }, "harness|arc:challenge|25": { "acc": 0.5639931740614335, "acc_stderr": 0.014491225699230918, "acc_norm": 0.60580204778157, "acc_norm_stderr": 0.014280522667467325 }, "harness|hellaswag|10": { "acc": 0.622087233618801, "acc_stderr": 0.004838747305783346, "acc_norm": 0.8249352718581956, "acc_norm_stderr": 0.003792458000523436 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353227, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353227 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7169811320754716, "acc_stderr": 0.027724236492700918, "acc_norm": 0.7169811320754716, "acc_norm_stderr": 0.027724236492700918 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05000000000000001, "acc_norm": 0.55, 
"acc_norm_stderr": 0.05000000000000001 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.04793724854411018, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411018 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.048786087144669955, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.048786087144669955 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.03265019475033582, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.04692008381368909, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.04692008381368909 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.025402555503260912, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.025402555503260912 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7419354838709677, "acc_stderr": 0.024892469172462836, "acc_norm": 0.7419354838709677, "acc_norm_stderr": 0.024892469172462836 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.03517945038691063, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009181, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009181 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.02833560973246336, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.02833560973246336 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015184, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015184 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6282051282051282, "acc_stderr": 0.024503472557110943, "acc_norm": 0.6282051282051282, "acc_norm_stderr": 0.024503472557110943 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059278, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059278 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3509933774834437, "acc_stderr": 0.03896981964257374, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257374 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8073394495412844, "acc_stderr": 0.016909276884936073, "acc_norm": 0.8073394495412844, "acc_norm_stderr": 0.016909276884936073 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5787037037037037, "acc_stderr": 0.033674621388960775, "acc_norm": 0.5787037037037037, "acc_norm_stderr": 0.033674621388960775 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.02977177522814562, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.02977177522814562 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776679, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776679 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6944444444444444, "acc_stderr": 0.044531975073749834, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.044531975073749834 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6993865030674846, "acc_stderr": 0.03602511318806771, "acc_norm": 0.6993865030674846, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.8155339805825242, "acc_stderr": 0.03840423627288276, "acc_norm": 0.8155339805825242, "acc_norm_stderr": 0.03840423627288276 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.02250903393707781, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.02250903393707781 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8020434227330779, "acc_stderr": 0.01424887354921758, "acc_norm": 0.8020434227330779, "acc_norm_stderr": 0.01424887354921758 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526501, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526501 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3418994413407821, "acc_stderr": 0.015864506461604654, "acc_norm": 0.3418994413407821, "acc_norm_stderr": 0.015864506461604654 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.025829163272757482, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.025829163272757482 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218894, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218894 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7037037037037037, "acc_stderr": 0.025407197798890162, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.025407197798890162 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 0.02968010556502904, "acc_norm": 0.450354609929078, "acc_norm_stderr": 0.02968010556502904 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.45241199478487615, "acc_stderr": 0.012712265105889135, "acc_norm": 0.45241199478487615, "acc_norm_stderr": 0.012712265105889135 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.028582709753898445, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.028582709753898445 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6421568627450981, "acc_stderr": 0.01939305840235544, "acc_norm": 0.6421568627450981, "acc_norm_stderr": 0.01939305840235544 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784596, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.847953216374269, "acc_stderr": 0.02753912288906145, "acc_norm": 0.847953216374269, "acc_norm_stderr": 0.02753912288906145 }, "harness|truthfulqa:mc|0": { "mc1": 0.28518971848225216, "mc1_stderr": 0.015805827874454892, "mc2": 0.42729089645764406, "mc2_stderr": 0.014384607984607447 }, "harness|winogrande|5": { "acc": 0.7774269928966061, "acc_stderr": 0.011690933809712666 }, "harness|gsm8k|5": { "acc": 0.30932524639878695, "acc_stderr": 0.01273171092507812 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
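The loading snippet in the summary above pulls the per-sample details for a single task. As a complementary, minimal sketch, the aggregated metrics and any other task's details can be loaded the same way; the config and split names below are taken from this card's own configuration list, and the `datasets` library is assumed to be installed:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2"

# Aggregated metrics for the run; the "latest" split always points at the most recent evaluation.
results = load_dataset(REPO, "results", split="latest")

# Per-task details follow the same pattern, e.g. the 5-shot GSM8K samples.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")

print(results)
print(gsm8k_details)
```

A timestamped split name (e.g. "2024_01_28T18_06_21.915778") can be passed instead of "latest" to pin a specific run.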
open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2
[ "region:us" ]
2024-01-28T18:08:41+00:00
{"pretty_name": "Evaluation run of BlouseJury/Mistral-7B-Discord-0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BlouseJury/Mistral-7B-Discord-0.2](https://huggingface.co/BlouseJury/Mistral-7B-Discord-0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:06:21.915778](https://huggingface.co/datasets/open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2/blob/main/results_2024-01-28T18-06-21.915778.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6242801405322218,\n \"acc_stderr\": 0.03267853525645898,\n \"acc_norm\": 0.6311693770360961,\n \"acc_norm_stderr\": 0.033351034128342096,\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.42729089645764406,\n \"mc2_stderr\": 0.014384607984607447\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230918,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.622087233618801,\n \"acc_stderr\": 0.004838747305783346,\n \"acc_norm\": 0.8249352718581956,\n \"acc_norm_stderr\": 0.003792458000523436\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05000000000000001,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05000000000000001\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110943,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110943\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257374,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257374\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814562,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814562\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n 
\"acc_stderr\": 0.01424887354921758,\n \"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.01424887354921758\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526501,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526501\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3418994413407821,\n \"acc_stderr\": 0.015864506461604654,\n \"acc_norm\": 0.3418994413407821,\n \"acc_norm_stderr\": 0.015864506461604654\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45241199478487615,\n \"acc_stderr\": 0.012712265105889135,\n \"acc_norm\": 0.45241199478487615,\n \"acc_norm_stderr\": 0.012712265105889135\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.42729089645764406,\n \"mc2_stderr\": 0.014384607984607447\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30932524639878695,\n \"acc_stderr\": 0.01273171092507812\n }\n}\n```", "repo_url": 
"https://huggingface.co/BlouseJury/Mistral-7B-Discord-0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-06-21.915778.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-06-21.915778.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-06-21.915778.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-06-21.915778.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-06-21.915778.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_06_21.915778", "path": ["**/details_harness|winogrande|5_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T18-06-21.915778.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T18_06_21.915778", "path": ["results_2024-01-28T18-06-21.915778.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-06-21.915778.parquet"]}]}]}
2024-01-28T18:09:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BlouseJury/Mistral-7B-Discord-0.2 Dataset automatically created during the evaluation run of model BlouseJury/Mistral-7B-Discord-0.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:06:21.915778 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
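The flattened card text above ends its loading instructions at "you can for instance do the following:" because the code snippet was dropped when the card was flattened. A minimal sketch of what that snippet typically looks like is given below, assuming the details repo follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other evaluation cards in this dump (the exact repo id here is an assumption, not taken from the original text; the config name is taken from the metadata listed above):

```python
from datasets import load_dataset

# Assumed repo id, following the open-llm-leaderboard/details_<org>__<model> pattern.
# "harness_winogrande_5" is one of the config names listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_BlouseJury__Mistral-7B-Discord-0.2",
    "harness_winogrande_5",
    split="train",
)
```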
[ "# Dataset Card for Evaluation run of BlouseJury/Mistral-7B-Discord-0.2\n\n\n\nDataset automatically created during the evaluation run of model BlouseJury/Mistral-7B-Discord-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:06:21.915778(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BlouseJury/Mistral-7B-Discord-0.2\n\n\n\nDataset automatically created during the evaluation run of model BlouseJury/Mistral-7B-Discord-0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:06:21.915778(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
e54433cb33819f303bd60b5cd69c0b0c5c773bc5
# Leather Dataset Welcome to the Leather Dataset, a specially curated collection of 50 images featuring used, worn, scuffed, scratched, and weathered leather. This dataset is tailored for training AI (LoRA) models. [![Discord](https://img.shields.io/discord/1091306623819059300?color=7289da&label=Discord&logo=discord&logoColor=fff&style=for-the-badge)](https://discord.com/invite/m3TBB9XEkb) ## Dataset Overview - **Content**: The dataset includes 50 high-quality images showcasing a variety of leather textures and conditions. These encompass used, worn, scuffed, scratched, and weathered appearances, providing a diverse range for AI training. - **Source**: All images are handpicked from this Unsplash collection: [Leather Items Collection on Unsplash](https://unsplash.com/collections/ch0ykOjmLq4/leather-items). - **Usage**: Designed for training AI models, especially those focusing on texture recognition and replication in leather materials. ## Licensing - The images in this dataset are licensed under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for free non-commercial use, provided that appropriate credit is given and the materials are not used for commercial purposes. - For more information on this license, please refer to the [CC BY-NC 2.0 License details](https://creativecommons.org/licenses/by-nc/2.0/). ## Dataset Composition Each image in the dataset comes with a caption generated by GPT-Vision. These captions have been specifically optimized for token shuffling during the training of AI models, enhancing their learning efficiency. The combination of meticulously selected images and intelligently crafted captions makes this dataset a valuable resource for training AI models in leather texture and condition recognition. ## How to Use the Dataset 1. **Download the Dataset**: Access and download the dataset from the provided link for your AI model training purposes. 2. **Explore Images and Captions**: Familiarize yourself with the dataset to understand the diverse range of leather textures and conditions. 3. **Train Your AI Model**: Utilize the dataset in your LoRA AI model training, leveraging the detailed captions for improved texture recognition and replication. ## Contributions and Feedback Your suggestions and contributions are highly appreciated. If you have any feedback or wish to contribute additional images to the dataset, please contact us. Your input helps in continuously improving the dataset for the AI community. ## Related https://blib.la/blog/crafting-the-future-blibla-s-ethical-approach-to-ai-model-training --- This Leather Dataset is an excellent resource for enhancing the capabilities of AI models in recognizing and replicating varied leather textures and conditions. We hope it proves to be a valuable tool in your AI development endeavors.
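For step 1 of the "How to Use the Dataset" section above, a minimal sketch of fetching the files programmatically is shown here; it assumes the standard `huggingface_hub` client, and `local_dir` is just an example path:

```python
from huggingface_hub import snapshot_download

# Download all 50 leather images plus their GPT-Vision captions for local LoRA training.
local_path = snapshot_download(
    repo_id="Blib-la/used_leather_dataset",  # repo id this card is published under
    repo_type="dataset",
    local_dir="./used_leather_dataset",      # example path, adjust as needed
)
print(f"Dataset files downloaded to: {local_path}")
```

Keep the CC BY-NC 2.0 terms described above in mind when using the downloaded files.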
Blib-la/used_leather_dataset
[ "license:cc-by-nc-2.0", "region:us" ]
2024-01-28T18:09:21+00:00
{"license": "cc-by-nc-2.0", "viewer": false}
2024-02-01T10:44:26+00:00
[]
[]
TAGS #license-cc-by-nc-2.0 #region-us
# Leather Dataset Welcome to the Leather Dataset, a specially curated collection of 50 images featuring used, worn, scuffed, scratched, and weathered leather. This dataset is tailored for training AI (LoRA) models. ![Discord](URL ## Dataset Overview - Content: The dataset includes 50 high-quality images showcasing a variety of leather textures and conditions. These encompass used, worn, scuffed, scratched, and weathered appearances, providing a diverse range for AI training. - Source: All images are handpicked from this Unsplash collection: Leather Items Collection on Unsplash. - Usage: Designed for training AI models, especially those focusing on texture recognition and replication in leather materials. ## Licensing - The images in this dataset are licensed under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for free non-commercial use, provided that appropriate credit is given and the materials are not used for commercial purposes. - For more information on this license, please refer to the CC BY-NC 2.0 License details. ## Dataset Composition Each image in the dataset comes with a caption generated by GPT-Vision. These captions have been specifically optimized for token shuffling during the training of AI models, enhancing their learning efficiency. The combination of meticulously selected images and intelligently crafted captions makes this dataset a valuable resource for training AI models in leather texture and condition recognition. ## How to Use the Dataset 1. Download the Dataset: Access and download the dataset from the provided link for your AI model training purposes. 2. Explore Images and Captions: Familiarize yourself with the dataset to understand the diverse range of leather textures and conditions. 3. Train Your AI Model: Utilize the dataset in your LoRA AI model training, leveraging the detailed captions for improved texture recognition and replication. ## Contributions and Feedback Your suggestions and contributions are highly appreciated. If you have any feedback or wish to contribute additional images to the dataset, please contact us. Your input helps in continuously improving the dataset for the AI community. ## Related URL --- This Leather Dataset is an excellent resource for enhancing the capabilities of AI models in recognizing and replicating varied leather textures and conditions. We hope it proves to be a valuable tool in your AI development endeavors.
[ "# Leather Dataset\n\nWelcome to the Leather Dataset, a specially curated collection of 50 images featuring used, worn, scuffed, scratched, and weathered leather. This dataset is tailored for training AI (LoRA) models.\n\n![Discord](URL", "## Dataset Overview\n\n- Content: The dataset includes 50 high-quality images showcasing a variety of leather textures and conditions. These encompass used, worn, scuffed, scratched, and weathered appearances, providing a diverse range for AI training.\n- Source: All images are handpicked from this Unsplash collection: Leather Items Collection on Unsplash.\n- Usage: Designed for training AI models, especially those focusing on texture recognition and replication in leather materials.", "## Licensing\n\n- The images in this dataset are licensed under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for free non-commercial use, provided that appropriate credit is given and the materials are not used for commercial purposes.\n- For more information on this license, please refer to the CC BY-NC 2.0 License details.", "## Dataset Composition\n\nEach image in the dataset comes with a caption generated by GPT-Vision. These captions have been specifically optimized for token shuffling during the training of AI models, enhancing their learning efficiency. The combination of meticulously selected images and intelligently crafted captions makes this dataset a valuable resource for training AI models in leather texture and condition recognition.", "## How to Use the Dataset\n\n1. Download the Dataset: Access and download the dataset from the provided link for your AI model training purposes.\n2. Explore Images and Captions: Familiarize yourself with the dataset to understand the diverse range of leather textures and conditions.\n3. Train Your AI Model: Utilize the dataset in your LoRA AI model training, leveraging the detailed captions for improved texture recognition and replication.", "## Contributions and Feedback\n\nYour suggestions and contributions are highly appreciated. If you have any feedback or wish to contribute additional images to the dataset, please contact us. Your input helps in continuously improving the dataset for the AI community.", "## Related\n\nURL\n\n---\n\nThis Leather Dataset is an excellent resource for enhancing the capabilities of AI models in recognizing and replicating varied leather textures and conditions. We hope it proves to be a valuable tool in your AI development endeavors." ]
[ "TAGS\n#license-cc-by-nc-2.0 #region-us \n", "# Leather Dataset\n\nWelcome to the Leather Dataset, a specially curated collection of 50 images featuring used, worn, scuffed, scratched, and weathered leather. This dataset is tailored for training AI (LoRA) models.\n\n![Discord](URL", "## Dataset Overview\n\n- Content: The dataset includes 50 high-quality images showcasing a variety of leather textures and conditions. These encompass used, worn, scuffed, scratched, and weathered appearances, providing a diverse range for AI training.\n- Source: All images are handpicked from this Unsplash collection: Leather Items Collection on Unsplash.\n- Usage: Designed for training AI models, especially those focusing on texture recognition and replication in leather materials.", "## Licensing\n\n- The images in this dataset are licensed under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for free non-commercial use, provided that appropriate credit is given and the materials are not used for commercial purposes.\n- For more information on this license, please refer to the CC BY-NC 2.0 License details.", "## Dataset Composition\n\nEach image in the dataset comes with a caption generated by GPT-Vision. These captions have been specifically optimized for token shuffling during the training of AI models, enhancing their learning efficiency. The combination of meticulously selected images and intelligently crafted captions makes this dataset a valuable resource for training AI models in leather texture and condition recognition.", "## How to Use the Dataset\n\n1. Download the Dataset: Access and download the dataset from the provided link for your AI model training purposes.\n2. Explore Images and Captions: Familiarize yourself with the dataset to understand the diverse range of leather textures and conditions.\n3. Train Your AI Model: Utilize the dataset in your LoRA AI model training, leveraging the detailed captions for improved texture recognition and replication.", "## Contributions and Feedback\n\nYour suggestions and contributions are highly appreciated. If you have any feedback or wish to contribute additional images to the dataset, please contact us. Your input helps in continuously improving the dataset for the AI community.", "## Related\n\nURL\n\n---\n\nThis Leather Dataset is an excellent resource for enhancing the capabilities of AI models in recognizing and replicating varied leather textures and conditions. We hope it proves to be a valuable tool in your AI development endeavors." ]
5e68b247243b3c23273f8fd9e1924ebd4d746d93
# Dataset Card for Evaluation run of eren23/NeuralDareBeagle-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [eren23/NeuralDareBeagle-7B-slerp](https://huggingface.co/eren23/NeuralDareBeagle-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:11:46.511504](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp/blob/main/results_2024-01-28T18-11-46.511504.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6554392261924249, "acc_stderr": 0.03212679462957801, "acc_norm": 0.6550602589470452, "acc_norm_stderr": 0.032794301399577036, "mc1": 0.5581395348837209, "mc1_stderr": 0.01738476747898621, "mc2": 0.6918000534624221, "mc2_stderr": 0.014976389591941985 }, "harness|arc:challenge|25": { "acc": 0.6953924914675768, "acc_stderr": 0.01344952210993249, "acc_norm": 0.7209897610921502, "acc_norm_stderr": 0.013106784883601333 }, "harness|hellaswag|10": { "acc": 0.7094204341764588, "acc_stderr": 0.004531019159414108, "acc_norm": 0.8819956184027086, "acc_norm_stderr": 0.0032195397905004732 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.720754716981132, "acc_stderr": 0.027611163402399715, "acc_norm": 0.720754716981132, "acc_norm_stderr": 0.027611163402399715 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr": 
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.45098039215686275, "acc_stderr": 0.049512182523962625, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.049512182523962625 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944433, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.46825396825396826, "acc_stderr": 0.04463112720677172, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268545, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268545 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3296296296296296, "acc_stderr": 0.02866120111652457, "acc_norm": 0.3296296296296296, "acc_norm_stderr": 0.02866120111652457 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, 
"acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.03398110890294636, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931796, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931796 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7890295358649789, "acc_stderr": 0.026558372502661916, "acc_norm": 0.7890295358649789, "acc_norm_stderr": 0.026558372502661916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507332, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507332 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993457, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993457 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.024105712607754307, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.024105712607754307 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4335195530726257, "acc_stderr": 0.016574027219517635, "acc_norm": 0.4335195530726257, "acc_norm_stderr": 0.016574027219517635 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7091503267973857, "acc_stderr": 0.02600480036395213, "acc_norm": 0.7091503267973857, "acc_norm_stderr": 0.02600480036395213 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47327249022164275, "acc_stderr": 0.01275197796767601, "acc_norm": 0.47327249022164275, "acc_norm_stderr": 0.01275197796767601 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146293, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146293 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623327, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685516, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685516 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5581395348837209, "mc1_stderr": 0.01738476747898621, "mc2": 0.6918000534624221, "mc2_stderr": 0.014976389591941985 }, "harness|winogrande|5": { "acc": 0.8255722178374112, "acc_stderr": 0.010665187902498435 }, "harness|gsm8k|5": { "acc": 0.7058377558756633, "acc_stderr": 0.012551285331470152 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
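Besides the per-task configuration shown in the card's own snippet (`harness_winogrande_5`), the card notes that the aggregated metrics live in the "results" configuration and that every configuration exposes a "latest" split. A minimal sketch of loading those aggregated numbers directly:

```python
from datasets import load_dataset

# Aggregated results of the most recent evaluation run, as described in the card above.
results = load_dataset(
    "open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp",
    "results",
    split="latest",
)
print(results[0])
```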
open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp
[ "region:us" ]
2024-01-28T18:14:06+00:00
{"pretty_name": "Evaluation run of eren23/NeuralDareBeagle-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [eren23/NeuralDareBeagle-7B-slerp](https://huggingface.co/eren23/NeuralDareBeagle-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:11:46.511504](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp/blob/main/results_2024-01-28T18-11-46.511504.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554392261924249,\n \"acc_stderr\": 0.03212679462957801,\n \"acc_norm\": 0.6550602589470452,\n \"acc_norm_stderr\": 0.032794301399577036,\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6918000534624221,\n \"mc2_stderr\": 0.014976389591941985\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.01344952210993249,\n \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601333\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7094204341764588,\n \"acc_stderr\": 0.004531019159414108,\n \"acc_norm\": 0.8819956184027086,\n \"acc_norm_stderr\": 0.0032195397905004732\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 
0.013625556907993457,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.6918000534624221,\n \"mc2_stderr\": 0.014976389591941985\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498435\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \"acc_stderr\": 0.012551285331470152\n }\n}\n```", "repo_url": 
"https://huggingface.co/eren23/NeuralDareBeagle-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_11_46.511504", "path": ["**/details_harness|winogrande|5_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T18-11-46.511504.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T18_11_46.511504", "path": ["results_2024-01-28T18-11-46.511504.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-11-46.511504.parquet"]}]}]}
2024-01-28T18:14:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of eren23/NeuralDareBeagle-7B-slerp Dataset automatically created during the evaluation run of model eren23/NeuralDareBeagle-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card text): ## Latest results These are the latest results from run 2024-01-28T18:11:46.511504 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
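The loading snippet referred to above ("To load the details from a run...") is stripped from this processed card text, so here is a minimal sketch of what it would look like. The repository id is an assumption based on the leaderboard's `details_<org>__<model>` naming convention; the config name and the "latest" split are taken from the configs listed in this record's metadata.

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's "details_<org>__<model>" pattern.
# "harness_winogrande_5" and the "latest" split appear in this record's config metadata.
data = load_dataset(
    "open-llm-leaderboard/details_eren23__NeuralDareBeagle-7B-slerp",
    "harness_winogrande_5",
    split="latest",
)
print(data.column_names)  # inspect which fields the per-example details expose
```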
[ "# Dataset Card for Evaluation run of eren23/NeuralDareBeagle-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model eren23/NeuralDareBeagle-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:11:46.511504(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of eren23/NeuralDareBeagle-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model eren23/NeuralDareBeagle-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:11:46.511504(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c28ecaabcbdd2b713e856c4517e538735affd0f7
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache](https://huggingface.co/saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:15:46.929459](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache/blob/main/results_2024-01-28T18-15-46.929459.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23209302149061675, "acc_stderr": 0.02992707807148158, "acc_norm": 0.23163380863022787, "acc_norm_stderr": 0.03071316471647595, "mc1": 0.23011015911872704, "mc1_stderr": 0.014734557959807763, "mc2": 0.48987576662324334, "mc2_stderr": 0.016135847085052512 }, "harness|arc:challenge|25": { "acc": 0.2022184300341297, "acc_stderr": 0.011737454431872104, "acc_norm": 0.23037542662116042, "acc_norm_stderr": 0.01230492841874761 }, "harness|hellaswag|10": { "acc": 0.25951005775741887, "acc_stderr": 0.004374699189284863, "acc_norm": 0.25941047600079664, "acc_norm_stderr": 0.004374153847826759 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 
0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.23011015911872704, "mc1_stderr": 0.014734557959807763, "mc2": 0.48987576662324334, "mc2_stderr": 0.016135847085052512 }, "harness|winogrande|5": { "acc": 0.5193370165745856, "acc_stderr": 0.01404197273371297 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
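Every per-task entry in the "Latest results" block above shares the same acc / acc_stderr / acc_norm layout, so simple aggregates can be recomputed straight from that JSON. A minimal sketch, using a few values copied from the block above (the dict is abridged; only a handful of the hendrycksTest entries are included):

```python
# A few hendrycksTest (MMLU) entries copied from the "Latest results" JSON above (abridged).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22, "acc_stderr": 0.04163331998932268},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398},
}

# Unweighted mean accuracy over the hendrycksTest subtasks included above.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"mean MMLU acc over {len(mmlu_accs)} subtasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```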
open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache
[ "region:us" ]
2024-01-28T18:17:30+00:00
{"pretty_name": "Evaluation run of saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache", "dataset_summary": "Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache](https://huggingface.co/saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:15:46.929459](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache/blob/main/results_2024-01-28T18-15-46.929459.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23209302149061675,\n \"acc_stderr\": 0.02992707807148158,\n \"acc_norm\": 0.23163380863022787,\n \"acc_norm_stderr\": 0.03071316471647595,\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.48987576662324334,\n \"mc2_stderr\": 0.016135847085052512\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2022184300341297,\n \"acc_stderr\": 0.011737454431872104,\n \"acc_norm\": 0.23037542662116042,\n \"acc_norm_stderr\": 0.01230492841874761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25951005775741887,\n \"acc_stderr\": 0.004374699189284863,\n \"acc_norm\": 0.25941047600079664,\n \"acc_norm_stderr\": 0.004374153847826759\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n 
\"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 
0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.48987576662324334,\n \"mc2_stderr\": 0.016135847085052512\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5193370165745856,\n \"acc_stderr\": 
0.01404197273371297\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-15-46.929459.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-15-46.929459.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-15-46.929459.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-15-46.929459.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-15-46.929459.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["**/details_harness|winogrande|5_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-28T18-15-46.929459.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T18_15_46.929459", "path": ["results_2024-01-28T18-15-46.929459.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-15-46.929459.parquet"]}]}]}
2024-01-28T18:17:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache Dataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:15:46.929459 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
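The loading snippet referenced above ("you can for instance do the following") was stripped from this flattened card text. A minimal sketch is shown below, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention and that the `harness_winogrande_5` configuration listed in this record's metadata is available; the repository name itself is an assumption, not stated in this record.

```python
from datasets import load_dataset

# Assumed repository name, derived from the open-llm-leaderboard naming convention
# seen elsewhere in this document; "harness_winogrande_5" is one of the
# configurations listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache",
    "harness_winogrande_5",
    split="train",
)
```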
[ "# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache\n\n\n\nDataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:15:46.929459(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache\n\n\n\nDataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-gqa-ub-16-best-for-KV-cache on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:15:46.929459(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
43b882d116a0bac2d131101fae7a34083286cd96
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v1](https://huggingface.co/CultriX/Wernicke-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CultriX__Wernicke-7B-v1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:28:02.657512](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v1/blob/main/results_2024-01-28T18-28-02.657512.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6553586443001213, "acc_stderr": 0.0320108832818923, "acc_norm": 0.6549223520273856, "acc_norm_stderr": 0.032676687569058165, "mc1": 0.572827417380661, "mc1_stderr": 0.017316834410963915, "mc2": 0.7095040229828242, "mc2_stderr": 0.014900810460710302 }, "harness|arc:challenge|25": { "acc": 0.7056313993174061, "acc_stderr": 0.01331852846053942, "acc_norm": 0.7320819112627986, "acc_norm_stderr": 0.012942030195136444 }, "harness|hellaswag|10": { "acc": 0.714299940250946, "acc_stderr": 0.004508239594503833, "acc_norm": 0.8847839075881299, "acc_norm_stderr": 0.0031863002304505757 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.02783491252754407, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.02783491252754407 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6763005780346821, "acc_stderr": 0.035676037996391706, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.035676037996391706 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944427, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944427 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083525, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033477, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033477 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.02857834836547308, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.02857834836547308 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.030684737115135367, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.030684737115135367 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.03861557546255169, "acc_norm": 
0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.034076320938540516, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.034076320938540516 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467618, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467618 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.031024411740572213, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.031024411740572213 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159465, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159465 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.02023714900899093, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.02023714900899093 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608311, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608311 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069367, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069367 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4245810055865922, "acc_stderr": 0.016531170993278884, "acc_norm": 0.4245810055865922, "acc_norm_stderr": 0.016531170993278884 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.02625605383571896, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.02625605383571896 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.02438366553103545, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.02438366553103545 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47196870925684486, "acc_stderr": 0.012750151802922433, "acc_norm": 0.47196870925684486, "acc_norm_stderr": 0.012750151802922433 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.028332959514031208, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.028332959514031208 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685517, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685517 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.572827417380661, "mc1_stderr": 0.017316834410963915, "mc2": 0.7095040229828242, "mc2_stderr": 0.014900810460710302 }, "harness|winogrande|5": { "acc": 0.8374112075769534, "acc_stderr": 0.01037045555134333 }, "harness|gsm8k|5": { "acc": 0.6959818043972706, "acc_stderr": 0.012670420440198673 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
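As a usage note for the aggregated metrics described above, the sketch below loads the "results" configuration at its "latest" split for this repository; both names are taken from the card text and configs listing and are assumed to exist here rather than confirmed.

```python
from datasets import load_dataset

# "results" is described above as the aggregated-results configuration and
# "latest" as the split pointing to the most recent run (both assumed from the card text).
results = load_dataset(
    "open-llm-leaderboard/details_CultriX__Wernicke-7B-v1",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```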
open-llm-leaderboard/details_CultriX__Wernicke-7B-v1
[ "region:us" ]
2024-01-28T18:30:24+00:00
{"pretty_name": "Evaluation run of CultriX/Wernicke-7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v1](https://huggingface.co/CultriX/Wernicke-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__Wernicke-7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:28:02.657512](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v1/blob/main/results_2024-01-28T18-28-02.657512.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6553586443001213,\n \"acc_stderr\": 0.0320108832818923,\n \"acc_norm\": 0.6549223520273856,\n \"acc_norm_stderr\": 0.032676687569058165,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963915,\n \"mc2\": 0.7095040229828242,\n \"mc2_stderr\": 0.014900810460710302\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.01331852846053942,\n \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136444\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714299940250946,\n \"acc_stderr\": 0.004508239594503833,\n \"acc_norm\": 0.8847839075881299,\n \"acc_norm_stderr\": 0.0031863002304505757\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n 
\"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135367,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135367\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 
0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n \"acc_stderr\": 0.016531170993278884,\n \"acc_norm\": 0.4245810055865922,\n \"acc_norm_stderr\": 0.016531170993278884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922433,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922433\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963915,\n \"mc2\": 0.7095040229828242,\n \"mc2_stderr\": 0.014900810460710302\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.01037045555134333\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \"acc_stderr\": 0.012670420440198673\n }\n}\n```", "repo_url": 
"https://huggingface.co/CultriX/Wernicke-7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-28-02.657512.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-28-02.657512.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-28-02.657512.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-28-02.657512.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-28-02.657512.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_28_02.657512", "path": ["**/details_harness|winogrande|5_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T18-28-02.657512.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T18_28_02.657512", "path": ["results_2024-01-28T18-28-02.657512.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-28-02.657512.parquet"]}]}]}
2024-01-28T18:30:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v1 Dataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:28:02.657512 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
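The card text above says "To load the details from a run, you can for instance do the following:" but the code snippet itself was dropped when the card was flattened into this field. For reference, a sketch matching the snippet given in the `dataset_summary` metadata earlier in this record (the repository and config names come from that snippet; the surrounding setup is illustrative):

```python
from datasets import load_dataset

# Same call as in the dataset_summary above: pull the winogrande details
# for CultriX/Wernicke-7B-v1; the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_CultriX__Wernicke-7B-v1",
    "harness_winogrande_5",
    split="train",
)
```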
[ "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v1\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:28:02.657512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v1\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:28:02.657512(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
763668abb1ad37050bd4eb314bc956410c2f31db
# Dataset Card for Evaluation run of tourist800/Marcoro14-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [tourist800/Marcoro14-7B-slerp](https://huggingface.co/tourist800/Marcoro14-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:32:24.206889](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp/blob/main/results_2024-01-28T18-32-24.206889.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6111388777526852, "acc_stderr": 0.03287383799644916, "acc_norm": 0.6159662135212005, "acc_norm_stderr": 0.03353760847602086, "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836882, "mc2": 0.5207873423930568, "mc2_stderr": 0.0153889376471881 }, "harness|arc:challenge|25": { "acc": 0.5861774744027304, "acc_stderr": 0.014392730009221004, "acc_norm": 0.6339590443686007, "acc_norm_stderr": 0.014077223108470137 }, "harness|hellaswag|10": { "acc": 0.6421031666998606, "acc_stderr": 0.004784018497679814, "acc_norm": 0.8376817367058355, "acc_norm_stderr": 0.0036798891253998155 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105653, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105653 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5063829787234042, "acc_stderr": 0.03268335899936337, "acc_norm": 0.5063829787234042, "acc_norm_stderr": 0.03268335899936337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.046151869625837026, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.046151869625837026 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.040703290137070705, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.040703290137070705 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055266, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055266 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.0437588849272706, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.0437588849272706 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5741935483870968, "acc_stderr": 0.028129112709165897, "acc_norm": 0.5741935483870968, "acc_norm_stderr": 0.028129112709165897 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306433, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5743589743589743, "acc_stderr": 0.025069094387296532, "acc_norm": 0.5743589743589743, "acc_norm_stderr": 0.025069094387296532 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815632, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815632 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, 
"acc_stderr": 0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8018348623853211, "acc_stderr": 0.017090573804217902, "acc_norm": 0.8018348623853211, "acc_norm_stderr": 0.017090573804217902 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588663, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588663 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.02616056824660146, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.02616056824660146 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.032521134899291884, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.032521134899291884 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8347107438016529, "acc_stderr": 0.03390780612972776, "acc_norm": 0.8347107438016529, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6993865030674846, "acc_stderr": 0.03602511318806771, "acc_norm": 0.6993865030674846, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8033205619412516, "acc_stderr": 0.014214138556913915, "acc_norm": 0.8033205619412516, "acc_norm_stderr": 0.014214138556913915 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6907514450867052, "acc_stderr": 0.024883140570071762, "acc_norm": 0.6907514450867052, "acc_norm_stderr": 0.024883140570071762 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41675977653631285, "acc_stderr": 0.016489134962438954, "acc_norm": 0.41675977653631285, "acc_norm_stderr": 0.016489134962438954 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.02625605383571896, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.02625605383571896 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 0.025483115601195455, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.025483115601195455 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236848, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236848 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4406779661016949, "acc_stderr": 0.012680037994097074, "acc_norm": 0.4406779661016949, "acc_norm_stderr": 0.012680037994097074 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.02934980313976587, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.02934980313976587 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6372549019607843, "acc_stderr": 0.019450768432505518, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.019450768432505518 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.0293936093198798, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.0293936093198798 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6019900497512438, "acc_stderr": 0.034611994290400135, "acc_norm": 0.6019900497512438, "acc_norm_stderr": 0.034611994290400135 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836882, "mc2": 0.5207873423930568, "mc2_stderr": 0.0153889376471881 }, "harness|winogrande|5": { "acc": 0.7790055248618785, "acc_stderr": 0.01166122363764341 }, "harness|gsm8k|5": { "acc": 0.40181956027293403, "acc_stderr": 0.01350435778749403 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp
[ "region:us" ]
2024-01-28T18:34:45+00:00
{"pretty_name": "Evaluation run of tourist800/Marcoro14-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [tourist800/Marcoro14-7B-slerp](https://huggingface.co/tourist800/Marcoro14-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:32:24.206889](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp/blob/main/results_2024-01-28T18-32-24.206889.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6111388777526852,\n \"acc_stderr\": 0.03287383799644916,\n \"acc_norm\": 0.6159662135212005,\n \"acc_norm_stderr\": 0.03353760847602086,\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5207873423930568,\n \"mc2_stderr\": 0.0153889376471881\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221004,\n \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.014077223108470137\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6421031666998606,\n \"acc_stderr\": 0.004784018497679814,\n \"acc_norm\": 0.8376817367058355,\n \"acc_norm_stderr\": 0.0036798891253998155\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936337,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165897,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165897\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8033205619412516,\n \"acc_stderr\": 0.014214138556913915,\n \"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.012680037994097074,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.012680037994097074\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505518,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505518\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n \"acc_stderr\": 0.034611994290400135,\n \"acc_norm\": 0.6019900497512438,\n \"acc_norm_stderr\": 0.034611994290400135\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5207873423930568,\n \"mc2_stderr\": 0.0153889376471881\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \"acc_stderr\": 0.01350435778749403\n }\n}\n```", "repo_url": 
"https://huggingface.co/tourist800/Marcoro14-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_32_24.206889", "path": ["**/details_harness|winogrande|5_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T18-32-24.206889.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T18_32_24.206889", "path": ["results_2024-01-28T18-32-24.206889.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-32-24.206889.parquet"]}]}]}
2024-01-28T18:35:07+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of tourist800/Marcoro14-7B-slerp Dataset automatically created during the evaluation run of model tourist800/Marcoro14-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:32:24.206889 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
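The card text above refers to a loading snippet ("you can for instance do the following:") that was stripped out of this processed rendering. A minimal sketch, reconstructed from this record's dataset_summary and config metadata (the repository, config, and split names below are taken from that metadata), would be:

```python
from datasets import load_dataset

# Details for a single evaluated task; per the card, the "train" split
# always points to the latest results for this configuration.
data = load_dataset(
    "open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics for the whole run live in the separate "results"
# configuration, which also exposes a "latest" split.
results = load_dataset(
    "open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp",
    "results",
    split="latest",
)
```

Each per-task configuration additionally exposes a split named after the run timestamp (here 2024_01_28T18_32_24.206889), so individual runs remain addressable if further evaluations are added later.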
[ "# Dataset Card for Evaluation run of tourist800/Marcoro14-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model tourist800/Marcoro14-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:32:24.206889(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of tourist800/Marcoro14-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model tourist800/Marcoro14-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:32:24.206889(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3d439c496849859b06d512eceadad496c45cab86
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss](https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). A minimal loading sketch for this aggregated configuration appears after the card below. To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:39:03.167001](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss/blob/main/results_2024-01-28T18-39-03.167001.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.232222236603846, "acc_stderr": 0.029944487420180275, "acc_norm": 0.23185623000454325, "acc_norm_stderr": 0.030730825589463752, "mc1": 0.21909424724602203, "mc1_stderr": 0.01448003857875745, "mc2": 0.4680726486198067, "mc2_stderr": 0.016052523463533863 }, "harness|arc:challenge|25": { "acc": 0.19112627986348124, "acc_stderr": 0.011490055292778589, "acc_norm": 0.2167235494880546, "acc_norm_stderr": 0.01204015671348119 }, "harness|hellaswag|10": { "acc": 0.2658832901812388, "acc_stderr": 0.004408994868650102, "acc_norm": 0.2664807807209719, "acc_norm_stderr": 0.00441214941571792 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 },
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.1724137931034483, "acc_stderr": 0.02657767218303658, "acc_norm": 0.1724137931034483, "acc_norm_stderr": 0.02657767218303658 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 
0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.21909424724602203, "mc1_stderr": 0.01448003857875745, "mc2": 0.4680726486198067, "mc2_stderr": 0.016052523463533863 }, "harness|winogrande|5": { "acc": 0.5122336227308603, "acc_stderr": 0.01404827882040562 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
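The card above mentions an aggregated "results" configuration in addition to the per-task configurations. Below is a minimal sketch of loading it, assuming the same `load_dataset` pattern shown in the card; the `"latest"` split name is taken from the config listing in this record's metadata further below.

```python
# Minimal sketch (assumption: the aggregated "results" config loads with the same
# pattern as the per-task example in the card; the "latest" split name comes from
# the config listing in this record's metadata).
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss",
    "results",
    split="latest",
)
# Each row of this split holds the aggregated metrics for one evaluation run.
print(results[0])
```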
open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss
[ "region:us" ]
2024-01-28T18:40:46+00:00
{"pretty_name": "Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss", "dataset_summary": "Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss](https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:39:03.167001](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss/blob/main/results_2024-01-28T18-39-03.167001.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.232222236603846,\n \"acc_stderr\": 0.029944487420180275,\n \"acc_norm\": 0.23185623000454325,\n \"acc_norm_stderr\": 0.030730825589463752,\n \"mc1\": 0.21909424724602203,\n \"mc1_stderr\": 0.01448003857875745,\n \"mc2\": 0.4680726486198067,\n \"mc2_stderr\": 0.016052523463533863\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19112627986348124,\n \"acc_stderr\": 0.011490055292778589,\n \"acc_norm\": 0.2167235494880546,\n \"acc_norm_stderr\": 0.01204015671348119\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2658832901812388,\n \"acc_stderr\": 0.004408994868650102,\n \"acc_norm\": 0.2664807807209719,\n \"acc_norm_stderr\": 0.00441214941571792\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 
0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.02657767218303658,\n \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.02657767218303658\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n 
\"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n \"mc1_stderr\": 0.01448003857875745,\n \"mc2\": 0.4680726486198067,\n \"mc2_stderr\": 0.016052523463533863\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5122336227308603,\n \"acc_stderr\": 0.01404827882040562\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-39-03.167001.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-39-03.167001.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-39-03.167001.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-39-03.167001.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-39-03.167001.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["**/details_harness|winogrande|5_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-28T18-39-03.167001.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T18_39_03.167001", "path": ["results_2024-01-28T18-39-03.167001.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-39-03.167001.parquet"]}]}]}
2024-01-28T18:41:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss Dataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:39:03.167001 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
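The loading snippet referred to above is not reproduced in this flattened text field; a minimal sketch follows, assuming the repository id for this record follows the Leaderboard's `details_<org>__<model>` naming pattern (the config name `harness_winogrande_5` and the `latest` split are taken from the configs listed in the metadata above):

```python
from datasets import load_dataset

# Minimal sketch: the repository id is assumed from the Open LLM Leaderboard's
# details_<org>__<model> naming pattern; "harness_winogrande_5" is one of the
# 63 configurations and the "latest" split points at the newest run.
data = load_dataset(
    "open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```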
[ "# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss\n\n\n\nDataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:39:03.167001(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss\n\n\n\nDataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-q-loss on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:39:03.167001(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
30b31443d52382b3c2eec4d7d0979ab20a13a5b5
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache](https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:43:29.335129](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache/blob/main/results_2024-01-28T18-43-29.335129.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2316811015320164, "acc_stderr": 0.029921145723277393, "acc_norm": 0.2319513129670488, "acc_norm_stderr": 0.030716547854869283, "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731606, "mc2": 0.4668597319189143, "mc2_stderr": 0.016229462983418045 }, "harness|arc:challenge|25": { "acc": 0.18515358361774745, "acc_stderr": 0.011350774438389699, "acc_norm": 0.2380546075085324, "acc_norm_stderr": 0.0124457700280262 }, "harness|hellaswag|10": { "acc": 0.26249751045608444, "acc_stderr": 0.004390923353200559, "acc_norm": 0.2704640509858594, "acc_norm_stderr": 0.004432917403755056 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 
0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, "acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.2350061199510404, "mc1_stderr": 0.014843061507731606, "mc2": 0.4668597319189143, "mc2_stderr": 0.016229462983418045 }, "harness|winogrande|5": { "acc": 0.5082872928176796, "acc_stderr": 0.014050555322824189 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
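Beyond the per-task configurations, the aggregated metrics mentioned above live in the separate "results" configuration; a minimal sketch of reading it with the same `datasets` call as the card's snippet follows (the "latest" split name for this configuration is assumed from the split layout shown for the other configurations in this record's metadata):

```python
from datasets import load_dataset

# Read the aggregated "results" configuration for this evaluation dataset;
# the "latest" split is assumed to point at the newest run, matching the
# timestamp/"latest" split layout of the other listed configurations.
results = load_dataset(
    "open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache",
    "results",
    split="latest",
)
print(results[0])
```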
open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache
[ "region:us" ]
2024-01-28T18:45:11+00:00
{"pretty_name": "Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache", "dataset_summary": "Dataset automatically created during the evaluation run of model [saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache](https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:43:29.335129](https://huggingface.co/datasets/open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache/blob/main/results_2024-01-28T18-43-29.335129.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2316811015320164,\n \"acc_stderr\": 0.029921145723277393,\n \"acc_norm\": 0.2319513129670488,\n \"acc_norm_stderr\": 0.030716547854869283,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731606,\n \"mc2\": 0.4668597319189143,\n \"mc2_stderr\": 0.016229462983418045\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.18515358361774745,\n \"acc_stderr\": 0.011350774438389699,\n \"acc_norm\": 0.2380546075085324,\n \"acc_norm_stderr\": 0.0124457700280262\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26249751045608444,\n \"acc_stderr\": 0.004390923353200559,\n \"acc_norm\": 0.2704640509858594,\n \"acc_norm_stderr\": 0.004432917403755056\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n 
\"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 
0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 
0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731606,\n \"mc2\": 0.4668597319189143,\n \"mc2_stderr\": 0.016229462983418045\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5082872928176796,\n \"acc_stderr\": 
0.014050555322824189\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-29.335129.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["**/details_harness|winogrande|5_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-28T18-43-29.335129.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T18_43_29.335129", "path": ["results_2024-01-28T18-43-29.335129.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T18-43-29.335129.parquet"]}]}]}
2024-01-28T18:45:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache Dataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:43:29.335129 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
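A minimal sketch of the loading call referenced above, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern and using the `harness_winogrande_5` config that appears in this record's metadata:

```python
from datasets import load_dataset

# Load the winogrande details config for this evaluation run; the repository
# name below is inferred from the details_<org>__<model> pattern and is an assumption.
data = load_dataset(
    "open-llm-leaderboard/details_saarvajanik__facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache",
    "harness_winogrande_5",
    split="train",
)
print(data)
```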
[ "# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache\n\n\n\nDataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:43:29.335129(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache\n\n\n\nDataset automatically created during the evaluation run of model saarvajanik/facebook-opt-6.7b-qcqa-ub-16-best-for-KV-cache on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:43:29.335129(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
72c2a13495d5170e95ada1a63956c8f93e806d2b
# Dataset Card for Evaluation run of tourist800/mistral_2X7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [tourist800/mistral_2X7b](https://huggingface.co/tourist800/mistral_2X7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_tourist800__mistral_2X7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T18:43:40.962065](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__mistral_2X7b/blob/main/results_2024-01-28T18-43-40.962065.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6111388777526852, "acc_stderr": 0.03287383799644916, "acc_norm": 0.6159662135212005, "acc_norm_stderr": 0.03353760847602086, "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836882, "mc2": 0.5207873423930568, "mc2_stderr": 0.0153889376471881 }, "harness|arc:challenge|25": { "acc": 0.5861774744027304, "acc_stderr": 0.014392730009221004, "acc_norm": 0.6339590443686007, "acc_norm_stderr": 0.014077223108470137 }, "harness|hellaswag|10": { "acc": 0.6421031666998606, "acc_stderr": 0.004784018497679814, "acc_norm": 0.8376817367058355, "acc_norm_stderr": 0.0036798891253998155 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.037385206761196686, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.037385206761196686 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6679245283018868, "acc_stderr": 0.02898545565233439, "acc_norm": 0.6679245283018868, "acc_norm_stderr": 0.02898545565233439 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.04784060704105653, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.04784060704105653 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5063829787234042, "acc_stderr": 0.03268335899936337, "acc_norm": 0.5063829787234042, "acc_norm_stderr": 0.03268335899936337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.046151869625837026, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.046151869625837026 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.040703290137070705, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.040703290137070705 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055266, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055266 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3968253968253968, "acc_stderr": 0.0437588849272706, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.0437588849272706 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5741935483870968, "acc_stderr": 0.028129112709165897, "acc_norm": 0.5741935483870968, "acc_norm_stderr": 0.028129112709165897 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306433, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306433 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5743589743589743, "acc_stderr": 0.025069094387296532, "acc_norm": 0.5743589743589743, "acc_norm_stderr": 0.025069094387296532 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815632, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815632 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 
0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8018348623853211, "acc_stderr": 0.017090573804217902, "acc_norm": 0.8018348623853211, "acc_norm_stderr": 0.017090573804217902 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7941176470588235, "acc_stderr": 0.028379449451588663, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.028379449451588663 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.02616056824660146, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.02616056824660146 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.032521134899291884, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.032521134899291884 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8347107438016529, "acc_stderr": 0.03390780612972776, "acc_norm": 0.8347107438016529, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7592592592592593, "acc_stderr": 0.04133119440243838, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.04133119440243838 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6993865030674846, "acc_stderr": 0.03602511318806771, "acc_norm": 0.6993865030674846, "acc_norm_stderr": 0.03602511318806771 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073325, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073325 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8033205619412516, "acc_stderr": 0.014214138556913915, "acc_norm": 0.8033205619412516, "acc_norm_stderr": 0.014214138556913915 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6907514450867052, "acc_stderr": 0.024883140570071762, "acc_norm": 0.6907514450867052, "acc_norm_stderr": 0.024883140570071762 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41675977653631285, "acc_stderr": 0.016489134962438954, "acc_norm": 0.41675977653631285, "acc_norm_stderr": 0.016489134962438954 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.02625605383571896, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.02625605383571896 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 0.025483115601195455, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.025483115601195455 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4574468085106383, "acc_stderr": 0.029719281272236848, "acc_norm": 0.4574468085106383, "acc_norm_stderr": 0.029719281272236848 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4406779661016949, "acc_stderr": 0.012680037994097074, "acc_norm": 0.4406779661016949, "acc_norm_stderr": 0.012680037994097074 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.02934980313976587, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.02934980313976587 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6372549019607843, "acc_stderr": 0.019450768432505518, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.019450768432505518 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.0293936093198798, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.0293936093198798 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6019900497512438, "acc_stderr": 0.034611994290400135, "acc_norm": 0.6019900497512438, "acc_norm_stderr": 0.034611994290400135 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836882, "mc2": 0.5207873423930568, "mc2_stderr": 0.0153889376471881 }, "harness|winogrande|5": { "acc": 0.7790055248618785, "acc_stderr": 0.01166122363764341 }, "harness|gsm8k|5": { "acc": 0.40181956027293403, "acc_stderr": 0.01350435778749403 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_tourist800__mistral_2X7b
[ "region:us" ]
2024-01-28T18:45:57+00:00
{"pretty_name": "Evaluation run of tourist800/mistral_2X7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [tourist800/mistral_2X7b](https://huggingface.co/tourist800/mistral_2X7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tourist800__mistral_2X7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T18:43:40.962065](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__mistral_2X7b/blob/main/results_2024-01-28T18-43-40.962065.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6111388777526852,\n \"acc_stderr\": 0.03287383799644916,\n \"acc_norm\": 0.6159662135212005,\n \"acc_norm_stderr\": 0.03353760847602086,\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5207873423930568,\n \"mc2_stderr\": 0.0153889376471881\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221004,\n \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.014077223108470137\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6421031666998606,\n \"acc_stderr\": 0.004784018497679814,\n \"acc_norm\": 0.8376817367058355,\n \"acc_norm_stderr\": 0.0036798891253998155\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n 
\"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936337,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936337\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165897,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165897\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n \"acc_stderr\": 0.014214138556913915,\n 
\"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n \"acc_stderr\": 0.012680037994097074,\n \"acc_norm\": 0.4406779661016949,\n \"acc_norm_stderr\": 0.012680037994097074\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505518,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505518\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n \"acc_stderr\": 0.034611994290400135,\n \"acc_norm\": 0.6019900497512438,\n \"acc_norm_stderr\": 0.034611994290400135\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5207873423930568,\n \"mc2_stderr\": 0.0153889376471881\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \"acc_stderr\": 0.01350435778749403\n }\n}\n```", "repo_url": "https://huggingface.co/tourist800/mistral_2X7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-40.962065.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-40.962065.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-40.962065.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-40.962065.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-40.962065.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-43-40.962065.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["**/details_harness|winogrande|5_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T18-43-40.962065.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T18_43_40.962065", "path": ["results_2024-01-28T18-43-40.962065.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T18-43-40.962065.parquet"]}]}]}
2024-01-28T18:46:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of tourist800/mistral_2X7b Dataset automatically created during the evaluation run of model tourist800/mistral_2X7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T18:43:40.962065 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
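To make the loading step above concrete, a minimal sketch is shown below; the repo id `open-llm-leaderboard/details_tourist800__mistral_2X7b` is assumed from the usual leaderboard naming scheme, while the `harness_winogrande_5` config and the `latest` split are taken from the configs listed in this record's metadata:

```python
from datasets import load_dataset

# Each harness task is exposed as its own config; the "latest" split
# always points at the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_tourist800__mistral_2X7b",
    "harness_winogrande_5",
    split="latest",
)
```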
[ "# Dataset Card for Evaluation run of tourist800/mistral_2X7b\n\n\n\nDataset automatically created during the evaluation run of model tourist800/mistral_2X7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:43:40.962065(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of tourist800/mistral_2X7b\n\n\n\nDataset automatically created during the evaluation run of model tourist800/mistral_2X7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T18:43:40.962065(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
3362866d13d72b1b7d2eefd1762f070dbd001bea
hihihi
constantinewang/census
[ "region:us" ]
2024-01-28T19:00:56+00:00
{}
2024-01-28T19:02:27+00:00
[]
[]
TAGS #region-us
hihihi
[]
[ "TAGS\n#region-us \n" ]
9bad2c9df651e10c0b2ee6ae2c93d6b960f42275
# Dataset Card for Evaluation run of paulilioaica/Collin-7B-dare <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [paulilioaica/Collin-7B-dare](https://huggingface.co/paulilioaica/Collin-7B-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_paulilioaica__Collin-7B-dare", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T19:04:00.761394](https://huggingface.co/datasets/open-llm-leaderboard/details_paulilioaica__Collin-7B-dare/blob/main/results_2024-01-28T19-04-00.761394.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5209951850139622, "acc_stderr": 0.034255506033106245, "acc_norm": 0.5260738190060968, "acc_norm_stderr": 0.035005434358067744, "mc1": 0.5177478580171359, "mc1_stderr": 0.01749247084307536, "mc2": 0.6520234737027234, "mc2_stderr": 0.015553743495889045 }, "harness|arc:challenge|25": { "acc": 0.6151877133105802, "acc_stderr": 0.014218371065251104, "acc_norm": 0.658703071672355, "acc_norm_stderr": 0.013855831287497726 }, "harness|hellaswag|10": { "acc": 0.6176060545708026, "acc_stderr": 0.004849788423944363, "acc_norm": 0.8207528380800637, "acc_norm_stderr": 0.003827752572770012 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750574, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750574 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5921052631578947, "acc_stderr": 0.039993097127774734, "acc_norm": 0.5921052631578947, "acc_norm_stderr": 0.039993097127774734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.569811320754717, "acc_stderr": 0.030471445867183238, "acc_norm": 0.569811320754717, "acc_norm_stderr": 0.030471445867183238 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.041014055198424264, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.041014055198424264 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 
0.049756985195624284 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.42127659574468085, "acc_stderr": 0.032278345101462665, "acc_norm": 0.42127659574468085, "acc_norm_stderr": 0.032278345101462665 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4689655172413793, "acc_stderr": 0.04158632762097828, "acc_norm": 0.4689655172413793, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.335978835978836, "acc_stderr": 0.02432631052914913, "acc_norm": 0.335978835978836, "acc_norm_stderr": 0.02432631052914913 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30952380952380953, "acc_stderr": 0.04134913018303316, "acc_norm": 0.30952380952380953, "acc_norm_stderr": 0.04134913018303316 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.4774193548387097, "acc_stderr": 0.02841498501970786, "acc_norm": 0.4774193548387097, "acc_norm_stderr": 0.02841498501970786 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3497536945812808, "acc_stderr": 0.03355400904969566, "acc_norm": 0.3497536945812808, "acc_norm_stderr": 0.03355400904969566 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5212121212121212, "acc_stderr": 0.03900828913737301, "acc_norm": 0.5212121212121212, "acc_norm_stderr": 0.03900828913737301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6717171717171717, "acc_stderr": 0.03345678422756775, "acc_norm": 0.6717171717171717, "acc_norm_stderr": 0.03345678422756775 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7823834196891192, "acc_stderr": 0.029778663037752954, "acc_norm": 0.7823834196891192, "acc_norm_stderr": 0.029778663037752954 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4948717948717949, "acc_stderr": 0.025349672906838653, "acc_norm": 0.4948717948717949, "acc_norm_stderr": 0.025349672906838653 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.027420019350945284, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.027420019350945284 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5, "acc_stderr": 0.032478490123081544, "acc_norm": 0.5, "acc_norm_stderr": 0.032478490123081544 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2847682119205298, "acc_stderr": 0.03684881521389023, 
"acc_norm": 0.2847682119205298, "acc_norm_stderr": 0.03684881521389023 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7137614678899082, "acc_stderr": 0.019379436628920003, "acc_norm": 0.7137614678899082, "acc_norm_stderr": 0.019379436628920003 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.03256850570293647, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.03256850570293647 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5441176470588235, "acc_stderr": 0.03495624522015476, "acc_norm": 0.5441176470588235, "acc_norm_stderr": 0.03495624522015476 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6877637130801688, "acc_stderr": 0.030165137867847004, "acc_norm": 0.6877637130801688, "acc_norm_stderr": 0.030165137867847004 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.03259625118416828, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.03259625118416828 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6030534351145038, "acc_stderr": 0.04291135671009225, "acc_norm": 0.6030534351145038, "acc_norm_stderr": 0.04291135671009225 }, "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.039849796533028725, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.039849796533028725 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6481481481481481, "acc_stderr": 0.04616631111801713, "acc_norm": 0.6481481481481481, "acc_norm_stderr": 0.04616631111801713 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5828220858895705, "acc_stderr": 0.0387410285981808, "acc_norm": 0.5828220858895705, "acc_norm_stderr": 0.0387410285981808 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326468, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326468 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7692307692307693, "acc_stderr": 0.027601921381417618, "acc_norm": 0.7692307692307693, "acc_norm_stderr": 0.027601921381417618 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6947637292464879, "acc_stderr": 0.016467711947635123, "acc_norm": 0.6947637292464879, "acc_norm_stderr": 0.016467711947635123 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5491329479768786, "acc_stderr": 0.026788811931562753, "acc_norm": 0.5491329479768786, "acc_norm_stderr": 0.026788811931562753 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3553072625698324, "acc_stderr": 0.016006989934803182, "acc_norm": 0.3553072625698324, "acc_norm_stderr": 0.016006989934803182 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5620915032679739, "acc_stderr": 0.02840830202033269, "acc_norm": 0.5620915032679739, "acc_norm_stderr": 0.02840830202033269 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5594855305466238, "acc_stderr": 0.028196400574197422, "acc_norm": 0.5594855305466238, "acc_norm_stderr": 0.028196400574197422 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.558641975308642, "acc_stderr": 0.02762873715566878, "acc_norm": 0.558641975308642, "acc_norm_stderr": 0.02762873715566878 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.3723404255319149, "acc_stderr": 0.02883892147125145, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.02883892147125145 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.3741851368970013, "acc_stderr": 0.01235933561817206, "acc_norm": 0.3741851368970013, "acc_norm_stderr": 0.01235933561817206 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.44485294117647056, "acc_stderr": 0.03018753206032939, "acc_norm": 0.44485294117647056, "acc_norm_stderr": 0.03018753206032939 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5147058823529411, "acc_stderr": 0.020219083895133924, "acc_norm": 0.5147058823529411, "acc_norm_stderr": 0.020219083895133924 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6, "acc_stderr": 0.03136250240935893, "acc_norm": 0.6, "acc_norm_stderr": 0.03136250240935893 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5024875621890548, "acc_stderr": 0.03535490150137289, "acc_norm": 0.5024875621890548, "acc_norm_stderr": 0.03535490150137289 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-virology|5": { "acc": 0.42168674698795183, "acc_stderr": 0.03844453181770917, "acc_norm": 0.42168674698795183, "acc_norm_stderr": 0.03844453181770917 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7426900584795322, "acc_stderr": 0.03352799844161865, "acc_norm": 0.7426900584795322, "acc_norm_stderr": 0.03352799844161865 }, "harness|truthfulqa:mc|0": { "mc1": 0.5177478580171359, "mc1_stderr": 0.01749247084307536, "mc2": 0.6520234737027234, "mc2_stderr": 0.015553743495889045 }, "harness|winogrande|5": { "acc": 0.7790055248618785, "acc_stderr": 0.01166122363764341 }, "harness|gsm8k|5": { "acc": 0.2100075815011372, "acc_stderr": 0.011219441626913252 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
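As a follow-up to the loading example in this card, the aggregated "results" configuration can be read the same way; a minimal sketch, assuming the "results" config and "latest" split defined for these leaderboard detail repos are present here as well:

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; "latest" points at the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_paulilioaica__Collin-7B-dare",
    "results",
    split="latest",
)
print(results[0])
```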
open-llm-leaderboard/details_paulilioaica__Collin-7B-dare
[ "region:us" ]
2024-01-28T19:06:31+00:00
{"pretty_name": "Evaluation run of paulilioaica/Collin-7B-dare", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulilioaica/Collin-7B-dare](https://huggingface.co/paulilioaica/Collin-7B-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulilioaica__Collin-7B-dare\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T19:04:00.761394](https://huggingface.co/datasets/open-llm-leaderboard/details_paulilioaica__Collin-7B-dare/blob/main/results_2024-01-28T19-04-00.761394.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5209951850139622,\n \"acc_stderr\": 0.034255506033106245,\n \"acc_norm\": 0.5260738190060968,\n \"acc_norm_stderr\": 0.035005434358067744,\n \"mc1\": 0.5177478580171359,\n \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.6520234737027234,\n \"mc2_stderr\": 0.015553743495889045\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251104,\n \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497726\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6176060545708026,\n \"acc_stderr\": 0.004849788423944363,\n \"acc_norm\": 0.8207528380800637,\n \"acc_norm_stderr\": 0.003827752572770012\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.041014055198424264,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.041014055198424264\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n 
\"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.032278345101462665,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.032278345101462665\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.335978835978836,\n \"acc_stderr\": 0.02432631052914913,\n \"acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914913\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4774193548387097,\n \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.4774193548387097,\n \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737301,\n \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.025349672906838653,\n \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.025349672906838653\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.032478490123081544,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.032478490123081544\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7137614678899082,\n \"acc_stderr\": 0.019379436628920003,\n \"acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.019379436628920003\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293647,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293647\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847004,\n \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847004\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.0387410285981808,\n \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.0387410285981808\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.027601921381417618,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.027601921381417618\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6947637292464879,\n 
\"acc_stderr\": 0.016467711947635123,\n \"acc_norm\": 0.6947637292464879,\n \"acc_norm_stderr\": 0.016467711947635123\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.026788811931562753,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.026788811931562753\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n \"acc_stderr\": 0.016006989934803182,\n \"acc_norm\": 0.3553072625698324,\n \"acc_norm_stderr\": 0.016006989934803182\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n \"acc_stderr\": 0.028196400574197422,\n \"acc_norm\": 0.5594855305466238,\n \"acc_norm_stderr\": 0.028196400574197422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.02762873715566878,\n \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.02762873715566878\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125145,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125145\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3741851368970013,\n \"acc_stderr\": 0.01235933561817206,\n \"acc_norm\": 0.3741851368970013,\n \"acc_norm_stderr\": 0.01235933561817206\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03136250240935893,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.5024875621890548,\n \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5177478580171359,\n \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.6520234737027234,\n \"mc2_stderr\": 0.015553743495889045\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2100075815011372,\n \"acc_stderr\": 0.011219441626913252\n }\n}\n```", "repo_url": 
"https://huggingface.co/paulilioaica/Collin-7B-dare", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|arc:challenge|25_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|gsm8k|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hellaswag|10_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-04-00.761394.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-04-00.761394.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-04-00.761394.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T19-04-00.761394.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-04-00.761394.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T19_04_00.761394", "path": ["**/details_harness|winogrande|5_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T19-04-00.761394.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T19_04_00.761394", "path": ["results_2024-01-28T19-04-00.761394.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T19-04-00.761394.parquet"]}]}]}
2024-01-28T19:06:53+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of paulilioaica/Collin-7B-dare Dataset automatically created during the evaluation run of model paulilioaica/Collin-7B-dare on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T19:04:00.761394 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
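The loading snippet that "you can for instance do the following" refers to was stripped from this processed text field. A minimal sketch of what it would look like is given below; the dataset id `open-llm-leaderboard/details_paulilioaica__Collin-7B-dare` and the `harness_winogrande_5` config name are assumptions based on the leaderboard's usual "details_<org>__<model>" naming, not values stated in this row.

```python
# A minimal sketch, not the stripped original snippet. The dataset id is assumed
# from the leaderboard's usual "details_<org>__<model>" naming convention.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_paulilioaica__Collin-7B-dare",
    "harness_winogrande_5",  # any per-task config listed in this record's metadata also works
    split="train",
)
```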
[ "# Dataset Card for Evaluation run of paulilioaica/Collin-7B-dare\n\n\n\nDataset automatically created during the evaluation run of model paulilioaica/Collin-7B-dare on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T19:04:00.761394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of paulilioaica/Collin-7B-dare\n\n\n\nDataset automatically created during the evaluation run of model paulilioaica/Collin-7B-dare on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T19:04:00.761394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7e1e0a592c9186fc12f34dbb3497650e88ff3201
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v17.1-32k](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v17.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T05:26:38.724116](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k/blob/main/results_2024-02-12T05-26-38.724116.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5820294540126106, "acc_stderr": 0.03338687908537487, "acc_norm": 0.5857871578259474, "acc_norm_stderr": 0.03406677498994728, "mc1": 0.38922888616891066, "mc1_stderr": 0.017068552680690328, "mc2": 0.5606214128788316, "mc2_stderr": 0.015225071278712598 }, "harness|arc:challenge|25": { "acc": 0.515358361774744, "acc_stderr": 0.014604496129394908, "acc_norm": 0.5554607508532423, "acc_norm_stderr": 0.014521226405627084 }, "harness|hellaswag|10": { "acc": 0.5853415654252141, "acc_stderr": 0.004916561213591288, "acc_norm": 0.7795259908384784, "acc_norm_stderr": 0.004137190475425526 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5407407407407407, "acc_stderr": 0.04304979692464243, "acc_norm": 0.5407407407407407, "acc_norm_stderr": 0.04304979692464243 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6264150943396226, "acc_stderr": 0.029773082713319875, "acc_norm": 0.6264150943396226, "acc_norm_stderr": 0.029773082713319875 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6180555555555556, "acc_stderr": 0.040629907841466674, "acc_norm": 0.6180555555555556, "acc_norm_stderr": 0.040629907841466674 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 
0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5144508670520231, "acc_stderr": 0.03810871630454764, "acc_norm": 0.5144508670520231, "acc_norm_stderr": 0.03810871630454764 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383886, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383886 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5063829787234042, "acc_stderr": 0.032683358999363366, "acc_norm": 0.5063829787234042, "acc_norm_stderr": 0.032683358999363366 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.045796394220704334, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.045796394220704334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482758, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36772486772486773, "acc_stderr": 0.02483383982556242, "acc_norm": 0.36772486772486773, "acc_norm_stderr": 0.02483383982556242 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7322580645161291, "acc_stderr": 0.025189006660212385, "acc_norm": 0.7322580645161291, "acc_norm_stderr": 0.025189006660212385 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.03514528562175007, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175007 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7777777777777778, "acc_stderr": 0.029620227874790486, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.029620227874790486 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8341968911917098, "acc_stderr": 0.026839845022314415, "acc_norm": 0.8341968911917098, "acc_norm_stderr": 0.026839845022314415 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5615384615384615, "acc_stderr": 0.025158266016868585, "acc_norm": 0.5615384615384615, "acc_norm_stderr": 0.025158266016868585 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2518518518518518, "acc_stderr": 0.026466117538959916, "acc_norm": 0.2518518518518518, "acc_norm_stderr": 0.026466117538959916 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6050420168067226, "acc_stderr": 0.03175367846096625, "acc_norm": 0.6050420168067226, "acc_norm_stderr": 0.03175367846096625 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526732, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7761467889908257, "acc_stderr": 0.017871217767790215, "acc_norm": 0.7761467889908257, "acc_norm_stderr": 0.017871217767790215 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39351851851851855, "acc_stderr": 0.03331747876370312, "acc_norm": 0.39351851851851855, "acc_norm_stderr": 0.03331747876370312 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604246, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.03210062154134987, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.03210062154134987 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6942148760330579, "acc_stderr": 0.042059539338841226, "acc_norm": 0.6942148760330579, "acc_norm_stderr": 0.042059539338841226 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6932515337423313, "acc_stderr": 0.03623089915724146, "acc_norm": 0.6932515337423313, "acc_norm_stderr": 0.03623089915724146 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7535121328224776, "acc_stderr": 0.015411308769686933, "acc_norm": 0.7535121328224776, "acc_norm_stderr": 0.015411308769686933 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6358381502890174, "acc_stderr": 0.02590663263101613, "acc_norm": 0.6358381502890174, "acc_norm_stderr": 0.02590663263101613 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.27262569832402234, "acc_stderr": 0.01489339173524962, "acc_norm": 0.27262569832402234, "acc_norm_stderr": 0.01489339173524962 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6764705882352942, "acc_stderr": 0.026787453111906504, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.026787453111906504 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.662379421221865, "acc_stderr": 0.026858825879488544, "acc_norm": 0.662379421221865, "acc_norm_stderr": 0.026858825879488544 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6327160493827161, "acc_stderr": 0.02682280175950789, 
"acc_norm": 0.6327160493827161, "acc_norm_stderr": 0.02682280175950789 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.029583452036284066, "acc_norm": 0.43617021276595747, "acc_norm_stderr": 0.029583452036284066 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4302477183833116, "acc_stderr": 0.012645361435115226, "acc_norm": 0.4302477183833116, "acc_norm_stderr": 0.012645361435115226 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5367647058823529, "acc_stderr": 0.03029061918048569, "acc_norm": 0.5367647058823529, "acc_norm_stderr": 0.03029061918048569 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5849673202614379, "acc_stderr": 0.019933627776857425, "acc_norm": 0.5849673202614379, "acc_norm_stderr": 0.019933627776857425 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6816326530612244, "acc_stderr": 0.029822533793982066, "acc_norm": 0.6816326530612244, "acc_norm_stderr": 0.029822533793982066 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8009950248756219, "acc_stderr": 0.028231365092758406, "acc_norm": 0.8009950248756219, "acc_norm_stderr": 0.028231365092758406 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.03891364495835821, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.03891364495835821 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7251461988304093, "acc_stderr": 0.03424042924691583, "acc_norm": 0.7251461988304093, "acc_norm_stderr": 0.03424042924691583 }, "harness|truthfulqa:mc|0": { "mc1": 0.38922888616891066, "mc1_stderr": 0.017068552680690328, "mc2": 0.5606214128788316, "mc2_stderr": 0.015225071278712598 }, "harness|winogrande|5": { "acc": 0.749802683504341, "acc_stderr": 0.01217300964244915 }, "harness|gsm8k|5": { "acc": 0.4268385140257771, "acc_stderr": 0.013624249696595226 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. 
--> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
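The card above mentions an aggregated "results" configuration; a short usage sketch for it follows. The dataset id is taken from this record, the "results" config name from the card text, and the "latest" split from the config layout shown earlier in this dump, so treat the split name as an assumption if the repository differs.

```python
# A minimal sketch: load the aggregated "results" configuration for this model
# and read the metrics of the most recent evaluation run via the "latest" split.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k",
    "results",
    split="latest",
)
print(results[0])  # aggregated acc / stderr values for the latest run
```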
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k
[ "region:us" ]
2024-01-28T19:12:22+00:00
{"pretty_name": "Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v17.1-32k](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v17.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T05:26:38.724116](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k/blob/main/results_2024-02-12T05-26-38.724116.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5820294540126106,\n \"acc_stderr\": 0.03338687908537487,\n \"acc_norm\": 0.5857871578259474,\n \"acc_norm_stderr\": 0.03406677498994728,\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5606214128788316,\n \"mc2_stderr\": 0.015225071278712598\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.014604496129394908,\n \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.014521226405627084\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5853415654252141,\n \"acc_stderr\": 0.004916561213591288,\n \"acc_norm\": 0.7795259908384784,\n \"acc_norm_stderr\": 0.004137190475425526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383886,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383886\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.02483383982556242,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.02483383982556242\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n 
\"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868585,\n \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868585\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n \"acc_stderr\": 0.015411308769686933,\n \"acc_norm\": 0.7535121328224776,\n \"acc_norm_stderr\": 0.015411308769686933\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.02590663263101613,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.02590663263101613\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.01489339173524962,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.01489339173524962\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.02682280175950789,\n \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.02682280175950789\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n \"acc_stderr\": 0.012645361435115226,\n \"acc_norm\": 0.4302477183833116,\n \"acc_norm_stderr\": 0.012645361435115226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.019933627776857425,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.019933627776857425\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5606214128788316,\n \"mc2_stderr\": 0.015225071278712598\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.01217300964244915\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.4268385140257771,\n \"acc_stderr\": 0.013624249696595226\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v17.1-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|arc:challenge|25_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|arc:challenge|25_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|gsm8k|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|gsm8k|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hellaswag|10_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hellaswag|10_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-10-05.948412.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T19-10-05.948412.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": 
"2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-10-05.948412.parquet"]}, 
{"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["**/details_harness|winogrande|5_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": ["**/details_harness|winogrande|5_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T05-26-38.724116.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T19_10_05.948412", "path": ["results_2024-01-28T19-10-05.948412.parquet"]}, {"split": "2024_02_12T05_26_38.724116", "path": 
["results_2024-02-12T05-26-38.724116.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T05-26-38.724116.parquet"]}]}]}
2024-02-12T05:29:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mistral-7b-v17.1-32k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T05:26:38.724116 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
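The loading snippet referenced in the card above was stripped from this rendered copy. A minimal sketch, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the exact repo id below is an inference from that pattern, not taken from the card):

```python
from datasets import load_dataset

# Assumed repo id, built from the "details_<org>__<model>" naming convention used
# by the Open LLM Leaderboard; any of the 63 task configurations can be requested.
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
print(data[0])
```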
[ "# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mistral-7b-v17.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T05:26:38.724116(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mistral-7b-v17.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T05:26:38.724116(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5f5bd92427ceee846503f1f070adca1a8c86d05d
## Python Copilot Image Training using Class Knowledge Graphs This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset. ### Details Each row contains a png file in the **dbytes** column. - Rows: 312836 - Size: 294.1 GB - Data type: png - Format: Knowledge graph using NetworkX with alpaca text box ### Schema The png is in the **dbytes** column: ``` { "dbytes": "binary", "dbytes_len": "int64", "dbytes_mb": "float64", "filename": "string", "path": "string", "repo": "string", "type": "string" } ``` ### How to use the dataset ```python from datasets import load_dataset ds = load_dataset("matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27", data_dir="files") ```
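Since the `dbytes` column stores raw png bytes, a short sketch of decoding one image after loading; the column names come from the schema above, while the use of Pillow and the default `train` split name are assumptions:

```python
import io

from datasets import load_dataset
from PIL import Image  # Pillow is an assumption; any png decoder works

ds = load_dataset(
    "matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27",
    data_dir="files",
)
row = ds["train"][0]                         # assumes the default "train" split
img = Image.open(io.BytesIO(row["dbytes"]))  # decode the raw png knowledge graph
print(row["repo"], row["path"], row["dbytes_mb"], "MB")
img.save("class-knowledge-graph.png")
```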
matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27
[ "task_categories:text-to-image", "task_categories:image-to-image", "task_categories:question-answering", "task_ids:parsing", "size_categories:100K<n<1M", "license:other", "python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "class", "classes", "region:us" ]
2024-01-28T19:32:15+00:00
{"license": ["other"], "size_categories": ["100K<n<1M"], "task_categories": ["text-to-image", "image-to-image", "question-answering"], "task_ids": ["parsing"], "pretty_name": "python copilot image training using class knowledge graphs updated 2024-01-27", "dataset_info": [{"config_name": "v1_transformers_examples_pytorch", "splits": [{"name": "v1_transformers_examples_pytorch"}]}, {"config_name": "v2_pytorch_torch_distributed_fsdp", "splits": [{"name": "v2_pytorch_torch_distributed_fsdp"}]}, {"config_name": "v3_deepspeed_deepspeed_runtime", "splits": [{"name": "v3_deepspeed_deepspeed_runtime"}]}, {"config_name": "v4_fused_gelu_testing_src", "splits": [{"name": "v4_fused_gelu_testing_src"}]}, {"config_name": "v5_unsloth_unsloth_models", "splits": [{"name": "v5_unsloth_unsloth_models"}]}, {"config_name": "v6_blip_models", "splits": [{"name": "v6_blip_models"}]}, {"config_name": "v7_text_generation_inference_server_text_generation_server", "splits": [{"name": "v7_text_generation_inference_server_text_generation_server"}]}, {"config_name": "v8_spark_python_pyspark_pandas_plot", "splits": [{"name": "v8_spark_python_pyspark_pandas_plot"}]}, {"config_name": "view_schema", "splits": [{"name": "view_schema"}]}], "configs": [{"config_name": "v1_transformers_examples_pytorch", "data_files": [{"split": "v1_transformers_examples_pytorch", "path": "train/train-0002-transformers-examples-pytorch.parquet"}]}, {"config_name": "v2_pytorch_torch_distributed_fsdp", "data_files": [{"split": "v2_pytorch_torch_distributed_fsdp", "path": "train/train-0003-pytorch-torch-distributed-fsdp.parquet"}]}, {"config_name": "v3_deepspeed_deepspeed_runtime", "data_files": [{"split": "v3_deepspeed_deepspeed_runtime", "path": "train/train-0004-deepspeed-deepspeed-runtime.parquet"}]}, {"config_name": "v4_fused_gelu_testing_src", "data_files": [{"split": "v4_fused_gelu_testing_srck", "path": "train/train-0005-fused-gelu-testing-src.parquet"}]}, {"config_name": "v5_unsloth_unsloth_models", "data_files": [{"split": "v5_unsloth_unsloth_models", "path": "train/train-0006-unsloth-unsloth-models.parquet"}]}, {"config_name": "v6_blip_models", "data_files": [{"split": "v6_blip_models", "path": "train/train-0007-blip-models.parquet"}]}, {"config_name": "v7_text_generation_inference_server_text_generation_server", "data_files": [{"split": "v7_text_generation_inference_server_text_generation_server", "path": "train/train-0008-text-generation-inference-server-text_generation_server.parquet"}]}, {"config_name": "v8_spark_python_pyspark_pandas_plot", "data_files": [{"split": "v8_spark_python_pyspark_pandas_plot", "path": "train/train-0009-spark-python-pyspark-pandas-plot.parquet"}]}, {"config_name": "view_schema", "data_files": [{"split": "view_schema", "path": "files/lok-python-copilot-image.class-v1_00003555.parquet"}]}], "tags": ["python-copilot", "python-coding", "python-architecture", "knowledge-graphs", "multimodal", "text-image-audio", "fine-tuning", "training", "question-answering", "image-knowledge-graph", "alpaca", "mp3", "png", "text", "instruct", "class", "classes"]}
2024-01-29T15:22:13+00:00
[]
[]
TAGS #task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #region-us
## Python Copilot Image Training using Class Knowledge Graphs This dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset. ### Details Each row contains a png file in the dbytes column. - Rows: 312836 - Size: 294.1 GB - Data type: png - Format: Knowledge graph using NetworkX with alpaca text box ### Schema The png is in the dbytes column: ### How to use the dataset
[ "## Python Copilot Image Training using Class Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.", "### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 312836\n- Size: 294.1 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box", "### Schema\n\nThe png is in the dbytes column:", "### How to use the dataset" ]
[ "TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-100K<n<1M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #region-us \n", "## Python Copilot Image Training using Class Knowledge Graphs\n\nThis dataset is a subset of the matlok python copilot datasets. Please refer to the Multimodal Python Copilot Training Overview for more details on how to use this dataset.", "### Details\n\nEach row contains a png file in the dbytes column.\n\n- Rows: 312836\n- Size: 294.1 GB\n- Data type: png\n- Format: Knowledge graph using NetworkX with alpaca text box", "### Schema\n\nThe png is in the dbytes column:", "### How to use the dataset" ]
e33e47552a021cd2055e83b099b0fd1d2c6f7347
# Dataset Card for Evaluation run of migtissera/Tess-10.7B-v1.5b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [migtissera/Tess-10.7B-v1.5b](https://huggingface.co/migtissera/Tess-10.7B-v1.5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T19:33:09.585454](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b/blob/main/results_2024-01-28T19-33-09.585454.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6623790658470444, "acc_stderr": 0.03145433935990498, "acc_norm": 0.6654734598510913, "acc_norm_stderr": 0.0320818568417205, "mc1": 0.31946144430844553, "mc1_stderr": 0.0163226441829605, "mc2": 0.4738041625665458, "mc2_stderr": 0.0145592749388437 }, "harness|arc:challenge|25": { "acc": 0.5998293515358362, "acc_stderr": 0.014317197787809169, "acc_norm": 0.6535836177474402, "acc_norm_stderr": 0.013905011180063232 }, "harness|hellaswag|10": { "acc": 0.6595299741087433, "acc_stderr": 0.004728988167338548, "acc_norm": 0.8533160724955188, "acc_norm_stderr": 0.003530675014892316 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.756578947368421, "acc_stderr": 0.034923496688842384, "acc_norm": 0.756578947368421, "acc_norm_stderr": 0.034923496688842384 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6878612716763006, "acc_stderr": 0.035331333893236574, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.035331333893236574 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.04685473041907789, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.04685473041907789 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6275862068965518, "acc_stderr": 0.04028731532947559, "acc_norm": 0.6275862068965518, "acc_norm_stderr": 0.04028731532947559 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4365079365079365, "acc_stderr": 0.025542846817400485, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.025542846817400485 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768176, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.022331707611823078, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.022331707611823078 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03011768892950357, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03011768892950357 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8585858585858586, "acc_stderr": 0.024825909793343343, "acc_norm": 0.8585858585858586, "acc_norm_stderr": 0.024825909793343343 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9326424870466321, "acc_stderr": 0.018088393839078912, "acc_norm": 0.9326424870466321, "acc_norm_stderr": 0.018088393839078912 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402538, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402538 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.02944316932303154, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.02944316932303154 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7100840336134454, "acc_stderr": 0.029472485833136094, "acc_norm": 0.7100840336134454, "acc_norm_stderr": 0.029472485833136094 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.32450331125827814, "acc_stderr": 
0.038227469376587525, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.038227469376587525 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8623853211009175, "acc_stderr": 0.014770105878649405, "acc_norm": 0.8623853211009175, "acc_norm_stderr": 0.014770105878649405 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6018518518518519, "acc_stderr": 0.033384734032074016, "acc_norm": 0.6018518518518519, "acc_norm_stderr": 0.033384734032074016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8649789029535865, "acc_stderr": 0.022245776632003694, "acc_norm": 0.8649789029535865, "acc_norm_stderr": 0.022245776632003694 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.03076935200822915, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.03076935200822915 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709696, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709696 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.03226219377286775, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.03226219377286775 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.49107142857142855, "acc_stderr": 0.04745033255489123, "acc_norm": 0.49107142857142855, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573973, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573973 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9017094017094017, "acc_stderr": 0.019503444900757567, "acc_norm": 0.9017094017094017, "acc_norm_stderr": 0.019503444900757567 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.013586619219903347, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.013586619219903347 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7514450867052023, "acc_stderr": 0.023267528432100174, "acc_norm": 0.7514450867052023, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39776536312849164, "acc_stderr": 0.016369204971262985, "acc_norm": 0.39776536312849164, "acc_norm_stderr": 0.016369204971262985 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7777777777777778, "acc_stderr": 0.023805186524888135, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.023805186524888135 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885142, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885142 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7592592592592593, "acc_stderr": 0.023788583551658537, "acc_norm": 0.7592592592592593, "acc_norm_stderr": 0.023788583551658537 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4954367666232073, "acc_stderr": 0.012769704263117519, "acc_norm": 0.4954367666232073, "acc_norm_stderr": 0.012769704263117519 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7536764705882353, "acc_stderr": 0.02617343857052, "acc_norm": 0.7536764705882353, "acc_norm_stderr": 0.02617343857052 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.684640522875817, "acc_stderr": 0.018798086284886887, "acc_norm": 0.684640522875817, "acc_norm_stderr": 0.018798086284886887 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7673469387755102, "acc_stderr": 0.02704925791589618, "acc_norm": 0.7673469387755102, "acc_norm_stderr": 0.02704925791589618 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327, "acc_norm": 0.8756218905472637, "acc_norm_stderr": 0.023335401790166327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.03015113445777634, "acc_norm": 0.9, "acc_norm_stderr": 0.03015113445777634 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.038695433234721015, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.038695433234721015 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.028782108105401705, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.028782108105401705 }, "harness|truthfulqa:mc|0": { "mc1": 0.31946144430844553, "mc1_stderr": 0.0163226441829605, "mc2": 0.4738041625665458, "mc2_stderr": 0.0145592749388437 }, "harness|winogrande|5": { "acc": 0.8279400157853196, "acc_stderr": 0.010607731615247001 }, "harness|gsm8k|5": { "acc": 0.5617892342683851, "acc_stderr": 0.013666915917255069 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
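Beyond the per-task snippet in the card, a short sketch of reading the aggregated scores from the `results` configuration; the `latest` split name is an assumption based on the config listings elsewhere in this document:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated results of every run;
# the "latest" split name is assumed from the data_files layout in the metadata.
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b",
    "results",
    split="latest",
)
print(results[0])
```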
open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b
[ "region:us" ]
2024-01-28T19:35:27+00:00
{"pretty_name": "Evaluation run of migtissera/Tess-10.7B-v1.5b", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-10.7B-v1.5b](https://huggingface.co/migtissera/Tess-10.7B-v1.5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T19:33:09.585454](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b/blob/main/results_2024-01-28T19-33-09.585454.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6623790658470444,\n \"acc_stderr\": 0.03145433935990498,\n \"acc_norm\": 0.6654734598510913,\n \"acc_norm_stderr\": 0.0320818568417205,\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.4738041625665458,\n \"mc2_stderr\": 0.0145592749388437\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809169,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6595299741087433,\n \"acc_stderr\": 0.004728988167338548,\n \"acc_norm\": 0.8533160724955188,\n \"acc_norm_stderr\": 0.003530675014892316\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947559,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947559\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400485,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400485\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343343,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343343\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649405,\n \"acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649405\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n \"acc_stderr\": 0.016369204971262985,\n \"acc_norm\": 0.39776536312849164,\n \"acc_norm_stderr\": 0.016369204971262985\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888135,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888135\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n \"acc_stderr\": 0.012769704263117519,\n \"acc_norm\": 0.4954367666232073,\n \"acc_norm_stderr\": 0.012769704263117519\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.4738041625665458,\n \"mc2_stderr\": 0.0145592749388437\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247001\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5617892342683851,\n \"acc_stderr\": 
0.013666915917255069\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Tess-10.7B-v1.5b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|arc:challenge|25_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|gsm8k|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hellaswag|10_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-33-09.585454.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-33-09.585454.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-33-09.585454.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T19-33-09.585454.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-33-09.585454.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T19_33_09.585454", "path": ["**/details_harness|winogrande|5_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T19-33-09.585454.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T19_33_09.585454", "path": ["results_2024-01-28T19-33-09.585454.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T19-33-09.585454.parquet"]}]}]}
2024-01-28T19:35:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of migtissera/Tess-10.7B-v1.5b Dataset automatically created during the evaluation run of model migtissera/Tess-10.7B-v1.5b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T19:33:09.585454 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
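The flattened card above refers to a loading snippet that was stripped out during processing. A minimal sketch of what that call would look like, assuming the repository follows the open-llm-leaderboard naming convention seen in the neighbouring records (details_migtissera__Tess-10.7B-v1.5b) and exposes a harness_winogrande_5 configuration:

```python
from datasets import load_dataset

# Repository id and configuration name are assumptions, inferred from the
# naming convention used by the other evaluation-run records in this dump.
data = load_dataset(
    "open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b",
    "harness_winogrande_5",
    split="train",  # the "train" split always points to the latest results
)
print(data)
```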
[ "# Dataset Card for Evaluation run of migtissera/Tess-10.7B-v1.5b\n\n\n\nDataset automatically created during the evaluation run of model migtissera/Tess-10.7B-v1.5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T19:33:09.585454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of migtissera/Tess-10.7B-v1.5b\n\n\n\nDataset automatically created during the evaluation run of model migtissera/Tess-10.7B-v1.5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T19:33:09.585454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
e8224a2261b52e7b8dde5535ca2a1e6e5f7694ac
This dataset consists of processed and separated data for training and validating a model using the California housing data.
Ryan-Pupia/CS482-HousingDataSet
[ "language:en", "license:mit", "region:us" ]
2024-01-28T20:13:34+00:00
{"language": ["en"], "license": "mit", "pretty_name": "Pre-Processed Housing Data", "dataset_info": {"features": [{"name": "log_stand__housing_median_age", "dtype": "float64"}, {"name": "log_stand__total_rooms", "dtype": "float64"}, {"name": "log_stand__total_bedrooms", "dtype": "float64"}, {"name": "log_stand__population", "dtype": "float64"}, {"name": "log_stand__households", "dtype": "float64"}, {"name": "log_stand__median_income", "dtype": "float64"}, {"name": "log_stand__median_house_value", "dtype": "float64"}, {"name": "encode__ocean_proximity_<1H OCEAN", "dtype": "float64"}, {"name": "encode__ocean_proximity_INLAND", "dtype": "float64"}, {"name": "encode__ocean_proximity_ISLAND", "dtype": "float64"}, {"name": "encode__ocean_proximity_NEAR BAY", "dtype": "float64"}, {"name": "encode__ocean_proximity_NEAR OCEAN", "dtype": "float64"}, {"name": "scale__longitude", "dtype": "float64"}, {"name": "scale__latitude", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 1648864, "num_examples": 14722}, {"name": "test", "num_bytes": 412272, "num_examples": 3681}], "download_size": 1130408, "dataset_size": 2061136}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-01-28T22:26:31+00:00
[]
[ "en" ]
TAGS #language-English #license-mit #region-us
This dataset consists of processed and separated data for training and validating a model using the California housing data.
[]
[ "TAGS\n#language-English #license-mit #region-us \n" ]
2aa8b662913b59229e7e564a74176889ae0d24fe
# Dataset Card for Evaluation run of SC44/Mistral-7B-private-sia <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SC44/Mistral-7B-private-sia](https://huggingface.co/SC44/Mistral-7B-private-sia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SC44__Mistral-7B-private-sia", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T20:47:57.957299](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-sia/blob/main/results_2024-01-28T20-47-57.957299.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6502731360719932, "acc_stderr": 0.032115797187921055, "acc_norm": 0.6500477380685269, "acc_norm_stderr": 0.03278068219550418, "mc1": 0.5667074663402693, "mc1_stderr": 0.01734702445010747, "mc2": 0.7243565223330868, "mc2_stderr": 0.014798861590895166 }, "harness|arc:challenge|25": { "acc": 0.6936860068259386, "acc_stderr": 0.013470584417276514, "acc_norm": 0.7252559726962458, "acc_norm_stderr": 0.013044617212771227 }, "harness|hellaswag|10": { "acc": 0.7276438956383191, "acc_stderr": 0.004442623590846324, "acc_norm": 0.8907588129854611, "acc_norm_stderr": 0.0031130406065401324 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720386, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720386 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.0373852067611967, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.0373852067611967 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.02845015479411864, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.02845015479411864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.02540255550326091, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.02540255550326091 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.02150024957603348, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.02150024957603348 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.029116617606083008, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.029116617606083008 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 
0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507338, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507338 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926917, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926917 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776679, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776679 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7633587786259542, "acc_stderr": 0.03727673575596913, "acc_norm": 0.7633587786259542, "acc_norm_stderr": 0.03727673575596913 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098823, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098823 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8275862068965517, "acc_stderr": 0.013507943909371803, "acc_norm": 0.8275862068965517, "acc_norm_stderr": 0.013507943909371803 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3776536312849162, "acc_stderr": 0.016214148752136632, "acc_norm": 0.3776536312849162, "acc_norm_stderr": 0.016214148752136632 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.02389187954195961, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.02389187954195961 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4929078014184397, "acc_stderr": 0.02982449855912901, "acc_norm": 0.4929078014184397, "acc_norm_stderr": 0.02982449855912901 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4634941329856584, "acc_stderr": 0.012736153390214961, "acc_norm": 0.4634941329856584, "acc_norm_stderr": 0.012736153390214961 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.028332959514031208, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.028332959514031208 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.01890101532209309, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.01890101532209309 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5667074663402693, "mc1_stderr": 0.01734702445010747, "mc2": 0.7243565223330868, "mc2_stderr": 0.014798861590895166 }, "harness|winogrande|5": { "acc": 0.8413575374901342, "acc_stderr": 0.010267936243028198 }, "harness|gsm8k|5": { "acc": 0.6671721000758151, "acc_stderr": 0.012979892496598278 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_SC44__Mistral-7B-private-sia
[ "region:us" ]
2024-01-28T20:50:16+00:00
{"pretty_name": "Evaluation run of SC44/Mistral-7B-private-sia", "dataset_summary": "Dataset automatically created during the evaluation run of model [SC44/Mistral-7B-private-sia](https://huggingface.co/SC44/Mistral-7B-private-sia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC44__Mistral-7B-private-sia\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T20:47:57.957299](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-sia/blob/main/results_2024-01-28T20-47-57.957299.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502731360719932,\n \"acc_stderr\": 0.032115797187921055,\n \"acc_norm\": 0.6500477380685269,\n \"acc_norm_stderr\": 0.03278068219550418,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.01734702445010747,\n \"mc2\": 0.7243565223330868,\n \"mc2_stderr\": 0.014798861590895166\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6936860068259386,\n \"acc_stderr\": 0.013470584417276514,\n \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7276438956383191,\n \"acc_stderr\": 0.004442623590846324,\n \"acc_norm\": 0.8907588129854611,\n \"acc_norm_stderr\": 0.0031130406065401324\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n \"acc_stderr\": 0.016214148752136632,\n \"acc_norm\": 0.3776536312849162,\n \"acc_norm_stderr\": 0.016214148752136632\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.01734702445010747,\n \"mc2\": 0.7243565223330868,\n \"mc2_stderr\": 0.014798861590895166\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028198\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \"acc_stderr\": 0.012979892496598278\n }\n}\n```", "repo_url": 
"https://huggingface.co/SC44/Mistral-7B-private-sia", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|arc:challenge|25_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|gsm8k|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hellaswag|10_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-47-57.957299.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-47-57.957299.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-47-57.957299.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T20-47-57.957299.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-47-57.957299.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T20_47_57.957299", "path": ["**/details_harness|winogrande|5_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T20-47-57.957299.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T20_47_57.957299", "path": ["results_2024-01-28T20-47-57.957299.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T20-47-57.957299.parquet"]}]}]}
2024-01-28T20:50:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SC44/Mistral-7B-private-sia Dataset automatically created during the evaluation run of model SC44/Mistral-7B-private-sia on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T20:47:57.957299 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
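The processed card text above keeps the "To load the details from a run, you can for instance do the following:" sentence, but the accompanying snippet was stripped. A minimal sketch, assuming the details dataset follows the usual `open-llm-leaderboard/details_<org>__<model>` naming used by these evaluation dumps (the config name `harness_winogrande_5` does appear in the configs listed in the metadata above):

```python
from datasets import load_dataset

# Hypothetical dataset id, inferred from the open-llm-leaderboard naming pattern;
# "harness_winogrande_5" is one of the configurations listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_SC44__Mistral-7B-private-sia",
    "harness_winogrande_5",
    split="train",  # "train" always points at the latest results
)
print(data)
```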
[ "# Dataset Card for Evaluation run of SC44/Mistral-7B-private-sia\n\n\n\nDataset automatically created during the evaluation run of model SC44/Mistral-7B-private-sia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T20:47:57.957299(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SC44/Mistral-7B-private-sia\n\n\n\nDataset automatically created during the evaluation run of model SC44/Mistral-7B-private-sia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T20:47:57.957299(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6cb09a8c2b3ee1471993f9043efb9d6f06561973
# Dataset Card for "asante_twi_bible" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lagyamfi/asante_twi_bible
[ "task_categories:automatic-speech-recognition", "task_categories:text-to-speech", "size_categories:1B<n<10B", "language:tw", "license:cc-by-sa-3.0", "GhanaNLP", "Twi", "region:us" ]
2024-01-28T20:59:13+00:00
{"language": ["tw"], "license": "cc-by-sa-3.0", "size_categories": ["1B<n<10B"], "task_categories": ["automatic-speech-recognition", "text-to-speech"], "dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "transcript", "dtype": "string"}, {"name": "verse", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15721845939.576, "num_examples": 21348}, {"name": "test", "num_bytes": 49094317.0, "num_examples": 64}, {"name": "validation", "num_bytes": 183952529.0, "num_examples": 217}], "download_size": 15531540341, "dataset_size": 15954892785.576}, "tags": ["GhanaNLP", "Twi"]}
2024-01-29T14:39:07+00:00
[]
[ "tw" ]
TAGS #task_categories-automatic-speech-recognition #task_categories-text-to-speech #size_categories-1B<n<10B #language-Twi #license-cc-by-sa-3.0 #GhanaNLP #Twi #region-us
# Dataset Card for "asante_twi_bible" More Information needed
[ "# Dataset Card for \"asante_twi_bible\"\n\nMore Information needed" ]
[ "TAGS\n#task_categories-automatic-speech-recognition #task_categories-text-to-speech #size_categories-1B<n<10B #language-Twi #license-cc-by-sa-3.0 #GhanaNLP #Twi #region-us \n", "# Dataset Card for \"asante_twi_bible\"\n\nMore Information needed" ]
ad995e6e0dc468a3525437ca0bd4113bd2f7aa36
# Dataset Card for Evaluation run of SC44/Mistral-7B-private-oia <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SC44/Mistral-7B-private-oia](https://huggingface.co/SC44/Mistral-7B-private-oia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SC44__Mistral-7B-private-oia", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T20:57:18.847869](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-oia/blob/main/results_2024-01-28T20-57-18.847869.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6481514121645634, "acc_stderr": 0.03222955295349873, "acc_norm": 0.6482463305308075, "acc_norm_stderr": 0.032893811368600714, "mc1": 0.5789473684210527, "mc1_stderr": 0.017283936248136473, "mc2": 0.7314870038039903, "mc2_stderr": 0.014661019064531787 }, "harness|arc:challenge|25": { "acc": 0.7013651877133106, "acc_stderr": 0.013374078615068747, "acc_norm": 0.7278156996587031, "acc_norm_stderr": 0.013006600406423702 }, "harness|hellaswag|10": { "acc": 0.7261501692889862, "acc_stderr": 0.004450214826707175, "acc_norm": 0.892352121091416, "acc_norm_stderr": 0.0030930175559380035 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337124, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337124 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 
0.04943110704237102 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.47058823529411764, "acc_stderr": 0.04966570903978529, "acc_norm": 0.47058823529411764, "acc_norm_stderr": 0.04966570903978529 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555498, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555498 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055266, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055266 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723302, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723302 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.030588697013783642, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.030588697013783642 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.01591955782997604, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.01591955782997604 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5185185185185185, "acc_stderr": 0.03407632093854051, "acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.03407632093854051 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02675640153807897, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02675640153807897 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179326, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179326 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.04688261722621504, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993464, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993464 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.02425790170532338, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.02425790170532338 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.37318435754189944, "acc_stderr": 0.016175692013381957, "acc_norm": 0.37318435754189944, "acc_norm_stderr": 0.016175692013381957 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6993464052287581, "acc_stderr": 0.026256053835718964, "acc_norm": 0.6993464052287581, "acc_norm_stderr": 0.026256053835718964 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7314814814814815, "acc_stderr": 0.024659685185967284, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.024659685185967284 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.475177304964539, "acc_stderr": 0.029790719243829727, "acc_norm": 0.475177304964539, "acc_norm_stderr": 0.029790719243829727 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4726205997392438, "acc_stderr": 0.01275107578801506, "acc_norm": 0.4726205997392438, "acc_norm_stderr": 0.01275107578801506 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.0286619962023353, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.0286619962023353 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.01885008469646872, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.01885008469646872 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.02650859065623327, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.02650859065623327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5789473684210527, "mc1_stderr": 0.017283936248136473, "mc2": 0.7314870038039903, "mc2_stderr": 0.014661019064531787 }, "harness|winogrande|5": { "acc": 0.8374112075769534, "acc_stderr": 0.010370455551343343 }, "harness|gsm8k|5": { "acc": 0.6459438968915845, "acc_stderr": 0.013172728385222574 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
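The per-task scores in the "Latest results" block are plain JSON, so they can be post-processed directly. A minimal sketch, assuming the block has been saved to a local file (here the hypothetical `latest_results.json`) with exactly the structure shown above:

```python
import json

# `latest_results.json` is a hypothetical local copy of the JSON block in the
# "Latest results" section (keys such as "all", "harness|hendrycksTest-*|5", ...).
with open("latest_results.json") as f:
    results = json.load(f)

# Average the 5-shot accuracies over all MMLU (hendrycksTest) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```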
open-llm-leaderboard/details_SC44__Mistral-7B-private-oia
[ "region:us" ]
2024-01-28T20:59:39+00:00
{"pretty_name": "Evaluation run of SC44/Mistral-7B-private-oia", "dataset_summary": "Dataset automatically created during the evaluation run of model [SC44/Mistral-7B-private-oia](https://huggingface.co/SC44/Mistral-7B-private-oia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC44__Mistral-7B-private-oia\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T20:57:18.847869](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-oia/blob/main/results_2024-01-28T20-57-18.847869.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6481514121645634,\n \"acc_stderr\": 0.03222955295349873,\n \"acc_norm\": 0.6482463305308075,\n \"acc_norm_stderr\": 0.032893811368600714,\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136473,\n \"mc2\": 0.7314870038039903,\n \"mc2_stderr\": 0.014661019064531787\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068747,\n \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7261501692889862,\n \"acc_stderr\": 0.004450214826707175,\n \"acc_norm\": 0.892352121091416,\n \"acc_norm_stderr\": 0.0030930175559380035\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723302,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.030588697013783642,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.030588697013783642\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n 
\"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n \"acc_stderr\": 0.016175692013381957,\n \"acc_norm\": 0.37318435754189944,\n \"acc_norm_stderr\": 0.016175692013381957\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.01275107578801506,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.01275107578801506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136473,\n \"mc2\": 0.7314870038039903,\n \"mc2_stderr\": 0.014661019064531787\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343343\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6459438968915845,\n \"acc_stderr\": 0.013172728385222574\n }\n}\n```", "repo_url": "https://huggingface.co/SC44/Mistral-7B-private-oia", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|arc:challenge|25_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|gsm8k|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hellaswag|10_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-57-18.847869.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-57-18.847869.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-57-18.847869.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T20-57-18.847869.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-57-18.847869.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T20-57-18.847869.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["**/details_harness|winogrande|5_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T20-57-18.847869.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T20_57_18.847869", "path": ["results_2024-01-28T20-57-18.847869.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T20-57-18.847869.parquet"]}]}]}
2024-01-28T20:59:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SC44/Mistral-7B-private-oia Dataset automatically created during the evaluation run of model SC44/Mistral-7B-private-oia on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T20:57:18.847869 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
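For example, the per-task details of this run can be loaded with the `datasets` library; other configurations listed in the repository metadata (e.g. `results` or `harness_gsm8k_5`) load the same way:

```python
from datasets import load_dataset

# Load the WinoGrande 5-shot details for this evaluation run.
data = load_dataset("open-llm-leaderboard/details_SC44__Mistral-7B-private-oia",
                    "harness_winogrande_5",
                    split="train")
```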
[ "# Dataset Card for Evaluation run of SC44/Mistral-7B-private-oia\n\n\n\nDataset automatically created during the evaluation run of model SC44/Mistral-7B-private-oia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T20:57:18.847869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SC44/Mistral-7B-private-oia\n\n\n\nDataset automatically created during the evaluation run of model SC44/Mistral-7B-private-oia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T20:57:18.847869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
1c3455cf85d480bc09a5e47fc405365b8948e254
A part of the `stingning/ultrachat` dataset, translated into Russian. To order a translation of your dataset into any language of the world: https://t.me/PyWebSol
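As a minimal sketch (assuming the `train` split and the parallel `role` / `content` sequence features listed in the dataset metadata below), the data can be loaded with the `datasets` library:

```python
from datasets import load_dataset

ds = load_dataset("PyWebSol/RussianUltrachat100k", split="train")

# Each row is one dialogue: `role` and `content` are parallel lists of strings.
example = ds[0]
for role, content in zip(example["role"], example["content"]):
    print(f"{role}: {content[:80]}")
```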
PyWebSol/RussianUltrachat100k
[ "task_categories:conversational", "task_categories:text-generation", "task_categories:question-answering", "language:ru", "license:apache-2.0", "region:us" ]
2024-01-28T21:19:17+00:00
{"language": ["ru"], "license": "apache-2.0", "task_categories": ["conversational", "text-generation", "question-answering"], "dataset_info": {"features": [{"name": "role", "sequence": "string"}, {"name": "content", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 1313144560, "num_examples": 100003}], "download_size": 551246373, "dataset_size": 1313144560}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-28T22:01:36+00:00
[]
[ "ru" ]
TAGS #task_categories-conversational #task_categories-text-generation #task_categories-question-answering #language-Russian #license-apache-2.0 #region-us
A part of the 'stingning/ultrachat' dataset, translated into Russian. To order a translation of your dataset into any language of the world: https://t.me/PyWebSol
[]
[ "TAGS\n#task_categories-conversational #task_categories-text-generation #task_categories-question-answering #language-Russian #license-apache-2.0 #region-us \n" ]
bde3baab22abaf2194e9ccd9a7e3ae1c82d1c5a4
These are all the samples for which a correctly formatted card could not be generated within 5 attempts. The included card is the last attempt. A card generation fails unless all of the following conditions are met: 1. The card must contain the correct tags starting the correct lines. 2. The card must not contain any tag more than once. 3. The card must not contain any spaces before new lines. 4. The card must not contain any unwanted tags (`'<|system|>', '<|user|>', '<|model|>', '<|FIRST_CHARACTER_MESSAGE|>', '<|SECOND_CHARACTER_MESSAGE|>'`). 5. The card must be generated with a stop token to complete. *Note: Samples which would end up longer than 8192 tokens, or samples with fewer than 4 turns, were ignored completely and not saved here.*
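As an illustration only, a minimal validation sketch along these lines might look as follows (the function name, the `required_tags` argument, and the `stopped` flag are hypothetical; this is not the actual filtering code used for these samples):

```python
UNWANTED_TAGS = ['<|system|>', '<|user|>', '<|model|>',
                 '<|FIRST_CHARACTER_MESSAGE|>', '<|SECOND_CHARACTER_MESSAGE|>']

def card_is_valid(card: str, required_tags: list, stopped: bool) -> bool:
    """Return True only if a generated card passes all five formatting checks."""
    lines = card.split('\n')
    for tag in required_tags:
        # Checks 1 and 2: each required tag must start exactly one line.
        if sum(line.startswith(tag) for line in lines) != 1:
            return False
    # Check 3: no spaces immediately before a newline.
    if any(line != line.rstrip(' ') for line in lines):
        return False
    # Check 4: none of the unwanted special tags may appear anywhere.
    if any(tag in card for tag in UNWANTED_TAGS):
        return False
    # Check 5: the generation must have finished with a stop token.
    return stopped
```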
PJMixers/grimulkan_bluemoon_Karen_cleaned-carded-failures
[ "source_datasets:grimulkan/bluemoon_Karen_cleaned", "language:en", "not-for-all-audiences", "region:us" ]
2024-01-28T21:26:06+00:00
{"language": ["en"], "source_datasets": "grimulkan/bluemoon_Karen_cleaned", "tags": ["not-for-all-audiences"]}
2024-01-29T21:34:31+00:00
[]
[ "en" ]
TAGS #source_datasets-grimulkan/bluemoon_Karen_cleaned #language-English #not-for-all-audiences #region-us
These are all the samples which failed to have a card generated with the correct formatting within 5 attempts. The included card is the last attempt. A card generation will fail if all these cases are not met: 1. Card must contain the correct tags starting the correct lines. 2. Card must not contain tags more than once. 3. Card must not contain any spaces before new lines. 4. Card must not contain any unwanted tags. (''<|system|>', '<|user|>', '<|model|>', '<|FIRST_CHARACTER_MESSAGE|>', '<|SECOND_CHARACTER_MESSAGE|>'') 5. Card must be generated with a stop token to complete. *Note: Samples which would end up longer than 8192 tokens, or samples with less than 4 turns were ignored completely and not saved here.*
[]
[ "TAGS\n#source_datasets-grimulkan/bluemoon_Karen_cleaned #language-English #not-for-all-audiences #region-us \n" ]
fc665c4f130507ded5f96da43ba58a9bff2c368a
# What is this / where do these chats come from?

This is a merge of the following datasets:
 - ultrachat
 - no_robots
 - SiberiaSoft/SiberianPersonaChat
 - russian_dialogues

# Why?

I made it for my attempt at fine-tuning Mistral 7B.

# Format

The dataset is assembled in the OAI message format

```
[
  {
    'role': 'user',
    'content': '...'
  },
  {
    'role': 'assistant',
    'content': '...'
  }
]
```

and is stored as a .jsonl file.

Here is how to convert it into the Anthropic format (well, not really Anthropic, since they use Human rather than User, but it does not matter):

```python
def format_oai(messages):
    # Concatenate OAI-style messages into a single plain-text prompt.
    chat_seq = ''
    for i in messages:
        # Capitalize the role name: 'user' -> 'User', 'assistant' -> 'Assistant'.
        role = f'{i["role"][0].upper()}{i["role"][1:]}'
        chat_seq += f'\n\n{role}: {i["content"]}'
    # Append an empty trailing "User: " turn.
    chat_seq += '\n\nUser: '
    return chat_seq
```
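For illustration, applying `format_oai` to a short exchange (the messages below are placeholders, not taken from the dataset) produces a prompt that ends with an open `User:` turn:

```python
messages = [
    {"role": "user", "content": "Hi there!"},
    {"role": "assistant", "content": "Hello! How can I help?"},
]

print(format_oai(messages))
# (two leading blank lines, then:)
# User: Hi there!
#
# Assistant: Hello! How can I help?
#
# User:
```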
ddosxd/merge
[ "size_categories:1M<n<10M", "language:en", "language:ru", "region:us" ]
2024-01-28T22:16:23+00:00
{"language": ["en", "ru"], "size_categories": ["1M<n<10M"], "pretty_name": "Merge"}
2024-01-28T22:52:49+00:00
[]
[ "en", "ru" ]
TAGS #size_categories-1M<n<10M #language-English #language-Russian #region-us
# What is this / where do these chats come from? This is a merge of the following datasets: - ultrachat - no_robots - SiberiaSoft/SiberianPersonaChat - russian_dialogues # Why? I made it for my attempt at fine-tuning Mistral 7B # Format The dataset is assembled in the OAI message format and is stored as a .jsonl file. Here is how to convert it into the Anthropic format (well, not really Anthropic, since they use Human rather than User, but it does not matter)
[ "# Что это/откуда тут чаты?\n\nЭто мерж датасетов \n - ultrachat\n - no_robots\n - SiberiaSoft/SiberianPersonaChat\n - russian_dialogues", "# Зачем? \n\nя это сделал для своей попытки файнтюна мистрала 7б", "# Формат\n\nдатасет собран в формате сообщений оаи\n\n\n\nи храниться файликом .jsonl.\n\nвот так его переконвертировать в формат антропиков\n (ваще не антропиков ибо у них не User, а Human, но неважно)" ]
[ "TAGS\n#size_categories-1M<n<10M #language-English #language-Russian #region-us \n", "# Что это/откуда тут чаты?\n\nЭто мерж датасетов \n - ultrachat\n - no_robots\n - SiberiaSoft/SiberianPersonaChat\n - russian_dialogues", "# Зачем? \n\nя это сделал для своей попытки файнтюна мистрала 7б", "# Формат\n\nдатасет собран в формате сообщений оаи\n\n\n\nи храниться файликом .jsonl.\n\nвот так его переконвертировать в формат антропиков\n (ваще не антропиков ибо у них не User, а Human, но неважно)" ]
98002218edc4563bb74b8d3a22f791e85ba3dd2e
# Dataset Card for Evaluation run of tourist800/Mistral-7B-Merge-14-v0.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [tourist800/Mistral-7B-Merge-14-v0.2](https://huggingface.co/tourist800/Mistral-7B-Merge-14-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_tourist800__Mistral-7B-Merge-14-v0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T22:20:57.442584](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__Mistral-7B-Merge-14-v0.2/blob/main/results_2024-01-28T22-20-57.442584.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6451093257176218, "acc_stderr": 0.03217531209458113, "acc_norm": 0.6472202105283938, "acc_norm_stderr": 0.03281997697766516, "mc1": 0.39412484700122397, "mc1_stderr": 0.017106588140700322, "mc2": 0.541523548604347, "mc2_stderr": 0.015563769430078258 }, "harness|arc:challenge|25": { "acc": 0.6339590443686007, "acc_stderr": 0.01407722310847014, "acc_norm": 0.6501706484641638, "acc_norm_stderr": 0.013936809212158296 }, "harness|hellaswag|10": { "acc": 0.6721768571997611, "acc_stderr": 0.004684606310642329, "acc_norm": 0.8513244373630751, "acc_norm_stderr": 0.0035504128916474488 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.042446332383532265, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.042446332383532265 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493857, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493857 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.75, "acc_stderr": 0.03621034121889507, "acc_norm": 0.75, "acc_norm_stderr": 0.03621034121889507 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.036928207672648664, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.048108401480826346, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.048108401480826346 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.02525303255499769, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.02525303255499769 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.02315787934908352, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.02315787934908352 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5270935960591133, "acc_stderr": 0.03512819077876106, "acc_norm": 0.5270935960591133, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.032876667586034906, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.032876667586034906 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.02381447708659355, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.02381447708659355 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948485, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059285, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059285 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8238532110091743, "acc_stderr": 0.016332882393431395, "acc_norm": 0.8238532110091743, "acc_norm_stderr": 0.016332882393431395 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.02732547096671631, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.02732547096671631 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229146, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229146 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070417, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.033519538795212696, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.033519538795212696 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5178571428571429, "acc_stderr": 0.047427623612430116, "acc_norm": 0.5178571428571429, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.037601780060266196, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.037601780060266196 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8122605363984674, "acc_stderr": 0.01396439376989914, "acc_norm": 0.8122605363984674, "acc_norm_stderr": 0.01396439376989914 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7225433526011561, "acc_stderr": 0.02410571260775431, "acc_norm": 0.7225433526011561, "acc_norm_stderr": 0.02410571260775431 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38994413407821227, "acc_stderr": 0.01631237662921307, "acc_norm": 0.38994413407821227, "acc_norm_stderr": 0.01631237662921307 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.02526169121972948, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.02526169121972948 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7234726688102894, "acc_stderr": 0.02540383297817961, "acc_norm": 0.7234726688102894, "acc_norm_stderr": 0.02540383297817961 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4661016949152542, "acc_stderr": 0.012740853872949829, "acc_norm": 0.4661016949152542, "acc_norm_stderr": 0.012740853872949829 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7095588235294118, "acc_stderr": 0.02757646862274054, "acc_norm": 0.7095588235294118, "acc_norm_stderr": 0.02757646862274054 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.027529637440174923, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.027529637440174923 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8656716417910447, "acc_stderr": 0.024112678240900798, "acc_norm": 0.8656716417910447, "acc_norm_stderr": 0.024112678240900798 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.034873508801977704, "acc_norm": 0.86, "acc_norm_stderr": 0.034873508801977704 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.02954774168764004, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.02954774168764004 }, "harness|truthfulqa:mc|0": { "mc1": 0.39412484700122397, "mc1_stderr": 0.017106588140700322, "mc2": 0.541523548604347, "mc2_stderr": 0.015563769430078258 }, "harness|winogrande|5": { "acc": 0.7924230465666929, "acc_stderr": 0.011398593419386776 }, "harness|gsm8k|5": { "acc": 0.5686125852918877, "acc_stderr": 0.013642195352511564 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
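As a small supplement to the loading example in the card above, the aggregated metrics can also be read back directly. This is a minimal sketch assuming the standard `datasets` API; the "results" configuration and the "latest" split are taken from this card's own description and metadata, so adjust them if the repository layout differs.

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration described above.
# The config name ("results") and the split name ("latest") come from this
# card's own metadata; treat them as assumptions to verify against the repo.
results = load_dataset(
    "open-llm-leaderboard/details_tourist800__Mistral-7B-Merge-14-v0.2",
    "results",
    split="latest",
)

# Each row of the "results" split holds the aggregated metrics for one run.
print(results[0])
```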
open-llm-leaderboard/details_tourist800__Mistral-7B-Merge-14-v0.2
[ "region:us" ]
2024-01-28T22:23:22+00:00
{"pretty_name": "Evaluation run of tourist800/Mistral-7B-Merge-14-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [tourist800/Mistral-7B-Merge-14-v0.2](https://huggingface.co/tourist800/Mistral-7B-Merge-14-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tourist800__Mistral-7B-Merge-14-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T22:20:57.442584](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__Mistral-7B-Merge-14-v0.2/blob/main/results_2024-01-28T22-20-57.442584.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6451093257176218,\n \"acc_stderr\": 0.03217531209458113,\n \"acc_norm\": 0.6472202105283938,\n \"acc_norm_stderr\": 0.03281997697766516,\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.541523548604347,\n \"mc2_stderr\": 0.015563769430078258\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.01407722310847014,\n \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158296\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6721768571997611,\n \"acc_stderr\": 0.004684606310642329,\n \"acc_norm\": 0.8513244373630751,\n \"acc_norm_stderr\": 0.0035504128916474488\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n 
\"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908352,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431395,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431395\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8122605363984674,\n \"acc_stderr\": 0.01396439376989914,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.01396439376989914\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 0.38994413407821227,\n \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.012740853872949829,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.012740853872949829\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274054,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274054\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n \"mc1_stderr\": 0.017106588140700322,\n \"mc2\": 0.541523548604347,\n \"mc2_stderr\": 0.015563769430078258\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \"acc_stderr\": 0.013642195352511564\n 
}\n}\n```", "repo_url": "https://huggingface.co/tourist800/Mistral-7B-Merge-14-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|arc:challenge|25_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|gsm8k|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hellaswag|10_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-20-57.442584.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-20-57.442584.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-20-57.442584.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T22-20-57.442584.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-20-57.442584.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T22_20_57.442584", "path": ["**/details_harness|winogrande|5_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T22-20-57.442584.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T22_20_57.442584", "path": ["results_2024-01-28T22-20-57.442584.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T22-20-57.442584.parquet"]}]}]}
2024-01-28T22:23:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of tourist800/Mistral-7B-Merge-14-v0.2 Dataset automatically created during the evaluation run of model tourist800/Mistral-7B-Merge-14-v0.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T22:20:57.442584 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of tourist800/Mistral-7B-Merge-14-v0.2\n\n\n\nDataset automatically created during the evaluation run of model tourist800/Mistral-7B-Merge-14-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T22:20:57.442584(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of tourist800/Mistral-7B-Merge-14-v0.2\n\n\n\nDataset automatically created during the evaluation run of model tourist800/Mistral-7B-Merge-14-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T22:20:57.442584(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
820959e0bc144b3a5860f6f22fd60931347529e6
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-base-ia <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-base-ia](https://huggingface.co/SC99/Mistral-7B-privatemix-base-ia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T22:55:47.919110](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia/blob/main/results_2024-01-28T22-55-47.919110.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6059609377950763, "acc_stderr": 0.0331864834631949, "acc_norm": 0.6099468548049356, "acc_norm_stderr": 0.03385489058934447, "mc1": 0.5385556915544676, "mc1_stderr": 0.017451384104637455, "mc2": 0.6875853718360632, "mc2_stderr": 0.015163177769189197 }, "harness|arc:challenge|25": { "acc": 0.5793515358361775, "acc_stderr": 0.014426211252508392, "acc_norm": 0.6279863481228669, "acc_norm_stderr": 0.014124597881844461 }, "harness|hellaswag|10": { "acc": 0.663114917347142, "acc_stderr": 0.004716792874433208, "acc_norm": 0.8485361481776539, "acc_norm_stderr": 0.0035776774950640818 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.27, "acc_stderr": 0.04461960433384741, "acc_norm": 0.27, "acc_norm_stderr": 0.04461960433384741 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.02872750295788027, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.02872750295788027 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6944444444444444, "acc_stderr": 0.03852084696008534, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.03852084696008534 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956913, "acc_norm": 0.49, "acc_norm_stderr": 
0.05024183937956913 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5722543352601156, "acc_stderr": 0.037724468575180255, "acc_norm": 0.5722543352601156, "acc_norm_stderr": 0.037724468575180255 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5319148936170213, "acc_stderr": 0.03261936918467382, "acc_norm": 0.5319148936170213, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37566137566137564, "acc_stderr": 0.024942368931159788, "acc_norm": 0.37566137566137564, "acc_norm_stderr": 0.024942368931159788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6838709677419355, "acc_stderr": 0.02645087448904277, "acc_norm": 0.6838709677419355, "acc_norm_stderr": 0.02645087448904277 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03053289223393202, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03053289223393202 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397457, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397457 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5897435897435898, "acc_stderr": 0.024939313906940788, "acc_norm": 0.5897435897435898, "acc_norm_stderr": 0.024939313906940788 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066482, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066482 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6512605042016807, "acc_stderr": 0.030956636328566548, "acc_norm": 0.6512605042016807, "acc_norm_stderr": 0.030956636328566548 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.3509933774834437, "acc_stderr": 0.038969819642573754, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.038969819642573754 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7944954128440367, "acc_stderr": 0.01732435232501602, "acc_norm": 0.7944954128440367, "acc_norm_stderr": 0.01732435232501602 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 0.03385177976044811, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.03385177976044811 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6098654708520179, "acc_stderr": 0.03273766725459156, "acc_norm": 0.6098654708520179, "acc_norm_stderr": 0.03273766725459156 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.038808483010823944, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.038808483010823944 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.04414343666854933, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7361963190184049, "acc_stderr": 0.034624199316156234, "acc_norm": 0.7361963190184049, "acc_norm_stderr": 0.034624199316156234 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8461538461538461, "acc_stderr": 0.0236368733174893, "acc_norm": 0.8461538461538461, "acc_norm_stderr": 0.0236368733174893 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7816091954022989, "acc_stderr": 0.014774358319934486, "acc_norm": 0.7816091954022989, "acc_norm_stderr": 0.014774358319934486 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6820809248554913, "acc_stderr": 0.02507071371915319, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.02507071371915319 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2837988826815642, "acc_stderr": 0.015078358970751745, "acc_norm": 0.2837988826815642, "acc_norm_stderr": 0.015078358970751745 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6895424836601307, "acc_stderr": 0.026493033225145898, "acc_norm": 0.6895424836601307, "acc_norm_stderr": 0.026493033225145898 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464485, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464485 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.691358024691358, "acc_stderr": 0.025702640260603746, "acc_norm": 0.691358024691358, "acc_norm_stderr": 0.025702640260603746 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4716312056737589, "acc_stderr": 0.029779450957303062, "acc_norm": 0.4716312056737589, "acc_norm_stderr": 0.029779450957303062 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4315514993481095, "acc_stderr": 0.012650007999463872, "acc_norm": 0.4315514993481095, "acc_norm_stderr": 0.012650007999463872 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.625, "acc_stderr": 0.029408372932278746, "acc_norm": 0.625, "acc_norm_stderr": 0.029408372932278746 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6405228758169934, "acc_stderr": 0.01941253924203216, "acc_norm": 0.6405228758169934, "acc_norm_stderr": 0.01941253924203216 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.02904308868330433, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.02904308868330433 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7611940298507462, "acc_stderr": 0.030147775935409224, "acc_norm": 0.7611940298507462, "acc_norm_stderr": 0.030147775935409224 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.03891364495835821, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.03891364495835821 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5385556915544676, "mc1_stderr": 0.017451384104637455, "mc2": 0.6875853718360632, "mc2_stderr": 0.015163177769189197 }, "harness|winogrande|5": { "acc": 0.7703235990528808, "acc_stderr": 0.011821645601838229 }, "harness|gsm8k|5": { "acc": 0.4404852160727824, "acc_stderr": 0.013674572131693884 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
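The snippet in the card above pulls a single task configuration; as a complementary, hedged sketch, the code below loads the aggregated "results" configuration and its "latest" split (both listed in this card's configs) and only inspects the available columns rather than assuming a particular schema.

```python
# Minimal sketch: inspect the aggregated "results" configuration of this details repo.
# The config name "results" and the split "latest" are taken from this card's metadata.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia",
    "results",
    split="latest",
)

print(results.column_names)  # see which fields the aggregated results expose
print(results[0])            # the most recent aggregated run
```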
open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia
[ "region:us" ]
2024-01-28T22:58:10+00:00
{"pretty_name": "Evaluation run of SC99/Mistral-7B-privatemix-base-ia", "dataset_summary": "Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-base-ia](https://huggingface.co/SC99/Mistral-7B-privatemix-base-ia) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T22:55:47.919110](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia/blob/main/results_2024-01-28T22-55-47.919110.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6059609377950763,\n \"acc_stderr\": 0.0331864834631949,\n \"acc_norm\": 0.6099468548049356,\n \"acc_norm_stderr\": 0.03385489058934447,\n \"mc1\": 0.5385556915544676,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6875853718360632,\n \"mc2_stderr\": 0.015163177769189197\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508392,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.663114917347142,\n \"acc_stderr\": 0.004716792874433208,\n \"acc_norm\": 0.8485361481776539,\n \"acc_norm_stderr\": 0.0035776774950640818\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.6838709677419355,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397457,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397457\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940788,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940788\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501602,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501602\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.6098654708520179,\n \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.0236368733174893,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.0236368733174893\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7816091954022989,\n \"acc_stderr\": 0.014774358319934486,\n \"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.014774358319934486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.02507071371915319,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.02507071371915319\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n \"acc_stderr\": 0.015078358970751745,\n \"acc_norm\": 0.2837988826815642,\n \"acc_norm_stderr\": 0.015078358970751745\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n \"acc_stderr\": 0.012650007999463872,\n \"acc_norm\": 0.4315514993481095,\n \"acc_norm_stderr\": 0.012650007999463872\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5385556915544676,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6875853718360632,\n \"mc2_stderr\": 0.015163177769189197\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4404852160727824,\n \"acc_stderr\": 0.013674572131693884\n }\n}\n```", "repo_url": 
"https://huggingface.co/SC99/Mistral-7B-privatemix-base-ia", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|arc:challenge|25_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|gsm8k|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hellaswag|10_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-55-47.919110.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-55-47.919110.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-55-47.919110.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T22-55-47.919110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-55-47.919110.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T22_55_47.919110", "path": ["**/details_harness|winogrande|5_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T22-55-47.919110.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T22_55_47.919110", "path": ["results_2024-01-28T22-55-47.919110.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T22-55-47.919110.parquet"]}]}]}
2024-01-28T22:58:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-base-ia Dataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-base-ia on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch after this card): ## Latest results These are the latest results from run 2024-01-28T22:55:47.919110 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
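The processed card above ends its load instruction at a colon because the original code block was stripped out. Below is a minimal sketch of that call; the repo id `open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia` is assumed from the leaderboard's usual details-repo naming convention and is not stated verbatim in this processed text, while the config and split names are taken from the config list above.

```python
from datasets import load_dataset

# Repo id assumed from the convention open-llm-leaderboard/details_<org>__<model>;
# it is not stated verbatim in the processed card above.
data = load_dataset(
    "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-base-ia",
    "harness_winogrande_5",  # any config listed above works, e.g. "harness_truthfulqa_mc_0"
    split="latest",          # or the timestamped split "2024_01_28T22_55_47.919110"
)
```

Using the `results` config instead would return the aggregated metrics for the run rather than per-example details.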
[ "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-base-ia\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-base-ia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T22:55:47.919110(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-base-ia\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-base-ia on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T22:55:47.919110(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5f269a3cda8ca0b187376e03e75d2b7596e754a4
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia1](https://huggingface.co/SC99/Mistral-7B-privatemix-ia1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T23:00:45.925269](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1/blob/main/results_2024-01-28T23-00-45.925269.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6514069537640662, "acc_stderr": 0.03224835259879914, "acc_norm": 0.6505037607853619, "acc_norm_stderr": 0.03293066455457689, "mc1": 0.5716034271725826, "mc1_stderr": 0.017323088597314743, "mc2": 0.7178902486503331, "mc2_stderr": 0.014856727473105872 }, "harness|arc:challenge|25": { "acc": 0.7141638225255973, "acc_stderr": 0.01320319608853737, "acc_norm": 0.7278156996587031, "acc_norm_stderr": 0.013006600406423702 }, "harness|hellaswag|10": { "acc": 0.7088229436367257, "acc_stderr": 0.004533764686211992, "acc_norm": 0.8858793069109739, "acc_norm_stderr": 0.003173079807440182 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr":
0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.049135952012744975, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.049135952012744975 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.02530590624159063, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.02530590624159063 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726854, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726854 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.03287666758603491, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.03287666758603491 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.023610884308927865, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.023610884308927865 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251972, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251972 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, 
"acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8311926605504587, "acc_stderr": 0.016060056268530343, "acc_norm": 0.8311926605504587, "acc_norm_stderr": 0.016060056268530343 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159463, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159463 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228732, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228732 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368982, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368982 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500097, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500097 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39553072625698327, "acc_stderr": 0.016353415410075775, "acc_norm": 0.39553072625698327, "acc_norm_stderr": 0.016353415410075775 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7091503267973857, "acc_stderr": 0.02600480036395213, "acc_norm": 0.7091503267973857, "acc_norm_stderr": 0.02600480036395213 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.02592237178881877, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.02592237178881877 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 
0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46153846153846156, "acc_stderr": 0.01273239828619044, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.01273239828619044 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6507352941176471, "acc_stderr": 0.028959755196824873, "acc_norm": 0.6507352941176471, "acc_norm_stderr": 0.028959755196824873 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.01885008469646872, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.01885008469646872 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291293, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291293 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.5716034271725826, "mc1_stderr": 0.017323088597314743, "mc2": 0.7178902486503331, "mc2_stderr": 0.014856727473105872 }, "harness|winogrande|5": { "acc": 0.850828729281768, "acc_stderr": 0.0100125988056273 }, "harness|gsm8k|5": { "acc": 0.6959818043972706, "acc_stderr": 0.012670420440198681 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
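Beyond the `train`-split example in the card above, each task config also exposes a `latest` split (plus the timestamped split `2024_01_28T23_00_45.925269`), and a `results` config holds the aggregated metrics of the run. A minimal sketch, not part of the original card:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1"

# Per-example details for a single task, always pointing at the newest evaluation.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

# Aggregated metrics for the whole run (the "results" configuration described above).
results = load_dataset(repo, "results", split="latest")
```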
open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1
[ "region:us" ]
2024-01-28T23:03:04+00:00
{"pretty_name": "Evaluation run of SC99/Mistral-7B-privatemix-ia1", "dataset_summary": "Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia1](https://huggingface.co/SC99/Mistral-7B-privatemix-ia1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T23:00:45.925269](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1/blob/main/results_2024-01-28T23-00-45.925269.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6514069537640662,\n \"acc_stderr\": 0.03224835259879914,\n \"acc_norm\": 0.6505037607853619,\n \"acc_norm_stderr\": 0.03293066455457689,\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7178902486503331,\n \"mc2_stderr\": 0.014856727473105872\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.01320319608853737,\n \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7088229436367257,\n \"acc_stderr\": 0.004533764686211992,\n \"acc_norm\": 0.8858793069109739,\n \"acc_norm_stderr\": 0.003173079807440182\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 
0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7178902486503331,\n \"mc2_stderr\": 0.014856727473105872\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.0100125988056273\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \"acc_stderr\": 0.012670420440198681\n }\n}\n```", "repo_url": "https://huggingface.co/SC99/Mistral-7B-privatemix-ia1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["**/details_harness|winogrande|5_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T23-00-45.925269.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T23_00_45.925269", "path": ["results_2024-01-28T23-00-45.925269.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T23-00-45.925269.parquet"]}]}]}
2024-01-28T23:03:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia1 Dataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T23:00:45.925269 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia1\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:00:45.925269(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia1\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:00:45.925269(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
71969ec57a6d9653d244730300e1155cfb650025
Formatted from [PygmalionAI/PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA) and [royallab/PIPPA-cleaned](https://huggingface.co/datasets/royallab/PIPPA-cleaned).
kootszhin/PIPPA-cleaned-formatted
[ "not-for-all-audiences", "region:us" ]
2024-01-28T23:04:48+00:00
{"tags": ["not-for-all-audiences"]}
2024-01-31T00:48:21+00:00
[]
[]
TAGS #not-for-all-audiences #region-us
Formatted from PygmalionAI/PIPPA and royallab/PIPPA-cleaned.
[]
[ "TAGS\n#not-for-all-audiences #region-us \n" ]
c362c84bb1e76ee84bce1009466717e96a37ec65
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia2](https://huggingface.co/SC99/Mistral-7B-privatemix-ia2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T23:07:03.566667](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2/blob/main/results_2024-01-28T23-07-03.566667.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6511206736607854, "acc_stderr": 0.03216914750872735, "acc_norm": 0.6506414688325837, "acc_norm_stderr": 0.032840632184423964, "mc1": 0.5593635250917993, "mc1_stderr": 0.017379697555437442, "mc2": 0.7132504580188571, "mc2_stderr": 0.01483377142815265 }, "harness|arc:challenge|25": { "acc": 0.7022184300341296, "acc_stderr": 0.013363080107244484, "acc_norm": 0.7226962457337884, "acc_norm_stderr": 0.013082095839059376 }, "harness|hellaswag|10": { "acc": 0.7087233618801035, "acc_stderr": 0.004534221350046108, "acc_norm": 0.8858793069109739, "acc_norm_stderr": 0.00317307980744018 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.03586879280080341, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 
0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.047028804320496165, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778405, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778405 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.023287665127268535, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.023287665127268535 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4876847290640394, "acc_stderr": 0.035169204442208966, "acc_norm": 0.4876847290640394, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.03287666758603491, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.03287666758603491 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267042, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267042 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.030684737115135363, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.030684737115135363 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8348623853211009, "acc_stderr": 0.01591955782997604, "acc_norm": 0.8348623853211009, "acc_norm_stderr": 0.01591955782997604 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474082, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.02595502084162113, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.02595502084162113 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098823, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098823 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8888888888888888, "acc_stderr": 0.020588491316092375, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.020588491316092375 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066297, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066297 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7283236994219653, "acc_stderr": 0.02394851290546836, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.02394851290546836 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.39776536312849164, "acc_stderr": 0.01636920497126299, "acc_norm": 0.39776536312849164, "acc_norm_stderr": 0.01636920497126299 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.696078431372549, "acc_stderr": 0.02633661346904663, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.02633661346904663 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983572, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983572 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6781045751633987, "acc_stderr": 0.018901015322093092, "acc_norm": 0.6781045751633987, "acc_norm_stderr": 0.018901015322093092 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5593635250917993, "mc1_stderr": 0.017379697555437442, "mc2": 0.7132504580188571, "mc2_stderr": 0.01483377142815265 }, "harness|winogrande|5": { "acc": 0.8389897395422258, "acc_stderr": 0.010329712832785722 }, "harness|gsm8k|5": { "acc": 0.689158453373768, "acc_stderr": 0.012748860507777718 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
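As a follow-up to the loading snippet in the card above, here is a minimal sketch of pulling the aggregated metrics instead of a single task's details. It assumes the "results" configuration and the "latest" split exist for this run, as listed in the dataset configs; the exact column layout of the results parquet is not documented here, so inspect it before indexing.

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2",
    "results",
    split="latest",
)

# Inspect the available columns before reading specific metrics.
print(results)
```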
open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2
[ "region:us" ]
2024-01-28T23:09:25+00:00
{"pretty_name": "Evaluation run of SC99/Mistral-7B-privatemix-ia2", "dataset_summary": "Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia2](https://huggingface.co/SC99/Mistral-7B-privatemix-ia2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T23:07:03.566667](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2/blob/main/results_2024-01-28T23-07-03.566667.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511206736607854,\n \"acc_stderr\": 0.03216914750872735,\n \"acc_norm\": 0.6506414688325837,\n \"acc_norm_stderr\": 0.032840632184423964,\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437442,\n \"mc2\": 0.7132504580188571,\n \"mc2_stderr\": 0.01483377142815265\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7087233618801035,\n \"acc_stderr\": 0.004534221350046108,\n \"acc_norm\": 0.8858793069109739,\n \"acc_norm_stderr\": 0.00317307980744018\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268535,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268535\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.01591955782997604,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.01591955782997604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066297,\n 
\"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066297\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n \"acc_stderr\": 0.01636920497126299,\n \"acc_norm\": 0.39776536312849164,\n \"acc_norm_stderr\": 0.01636920497126299\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n \"mc1_stderr\": 0.017379697555437442,\n \"mc2\": 0.7132504580188571,\n \"mc2_stderr\": 0.01483377142815265\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.689158453373768,\n \"acc_stderr\": 0.012748860507777718\n }\n}\n```", "repo_url": "https://huggingface.co/SC99/Mistral-7B-privatemix-ia2", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-07-03.566667.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-07-03.566667.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-07-03.566667.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-07-03.566667.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-07-03.566667.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-07-03.566667.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["**/details_harness|winogrande|5_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T23-07-03.566667.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_28T23_07_03.566667", "path": ["results_2024-01-28T23-07-03.566667.parquet"]}, {"split": "latest", "path": 
["results_2024-01-28T23-07-03.566667.parquet"]}]}]}
2024-01-28T23:09:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia2 Dataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T23:07:03.566667 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
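The loading snippet referenced just above ("you can for instance do the following") is not reproduced in this processed text field; below is a minimal sketch following the pattern used elsewhere in this dump. The repository name `open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2` is assumed from the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the config names listed in this record's metadata.

```python
from datasets import load_dataset

# Assumed repository name, following the leaderboard's details_<org>__<model> pattern.
data = load_dataset(
    "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia2",
    "harness_winogrande_5",  # one of the configs listed in this record's metadata
    split="train",           # "train" always points to the latest results
)
print(data)
```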
[ "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia2\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:07:03.566667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia2\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:07:03.566667(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
17cb3ad2ed344bbcaf89ffdb45e2aeaf7f1553d2
# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.5b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [migtissera/Tess-34B-v1.5b](https://huggingface.co/migtissera/Tess-34B-v1.5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T23:12:19.798626](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b/blob/main/results_2024-01-28T23-12-19.798626.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7571140160311144, "acc_stderr": 0.028404294310283486, "acc_norm": 0.7619075094631577, "acc_norm_stderr": 0.028933953410883263, "mc1": 0.3733170134638923, "mc1_stderr": 0.01693237055757063, "mc2": 0.5312154281103626, "mc2_stderr": 0.015485998460539758 }, "harness|arc:challenge|25": { "acc": 0.6177474402730375, "acc_stderr": 0.014200454049979277, "acc_norm": 0.6390784982935154, "acc_norm_stderr": 0.014034761386175449 }, "harness|hellaswag|10": { "acc": 0.6555467038438558, "acc_stderr": 0.004742185169264772, "acc_norm": 0.8442541326428998, "acc_norm_stderr": 0.0036187316588377205 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.7185185185185186, "acc_stderr": 0.038850042458002526, "acc_norm": 0.7185185185185186, "acc_norm_stderr": 0.038850042458002526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.875, "acc_stderr": 0.026913523521537846, "acc_norm": 0.875, "acc_norm_stderr": 0.026913523521537846 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7811320754716982, "acc_stderr": 0.025447863825108594, "acc_norm": 0.7811320754716982, "acc_norm_stderr": 0.025447863825108594 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8611111111111112, "acc_stderr": 0.028919802956134905, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.028919802956134905 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7283236994219653, "acc_stderr": 0.033917503223216586, "acc_norm": 0.7283236994219653, "acc_norm_stderr": 0.033917503223216586 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5490196078431373, "acc_stderr": 0.04951218252396262, "acc_norm": 0.5490196078431373, "acc_norm_stderr": 0.04951218252396262 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.82, "acc_stderr": 0.038612291966536955, "acc_norm": 0.82, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.7914893617021277, "acc_stderr": 0.026556982117838746, "acc_norm": 0.7914893617021277, "acc_norm_stderr": 0.026556982117838746 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6403508771929824, "acc_stderr": 0.04514496132873633, "acc_norm": 0.6403508771929824, "acc_norm_stderr": 0.04514496132873633 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7448275862068966, "acc_stderr": 0.03632984052707842, "acc_norm": 0.7448275862068966, "acc_norm_stderr": 0.03632984052707842 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.6640211640211641, "acc_stderr": 0.024326310529149145, "acc_norm": 0.6640211640211641, "acc_norm_stderr": 0.024326310529149145 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5634920634920635, "acc_stderr": 0.04435932892851466, "acc_norm": 0.5634920634920635, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9225806451612903, "acc_stderr": 0.015203644420774848, "acc_norm": 0.9225806451612903, "acc_norm_stderr": 0.015203644420774848 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6798029556650246, "acc_stderr": 0.032826493853041504, "acc_norm": 0.6798029556650246, "acc_norm_stderr": 0.032826493853041504 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066584, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066584 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9292929292929293, "acc_stderr": 0.01826310542019949, "acc_norm": 0.9292929292929293, "acc_norm_stderr": 0.01826310542019949 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9689119170984456, "acc_stderr": 0.012525310625527034, "acc_norm": 0.9689119170984456, "acc_norm_stderr": 0.012525310625527034 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8179487179487179, "acc_stderr": 0.019565236782930883, "acc_norm": 0.8179487179487179, "acc_norm_stderr": 0.019565236782930883 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.43703703703703706, "acc_stderr": 0.030242862397654002, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.030242862397654002 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8487394957983193, "acc_stderr": 0.023274255898707952, "acc_norm": 0.8487394957983193, "acc_norm_stderr": 0.023274255898707952 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4768211920529801, "acc_stderr": 0.04078093859163083, "acc_norm": 
0.4768211920529801, "acc_norm_stderr": 0.04078093859163083 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9247706422018349, "acc_stderr": 0.011308662537571743, "acc_norm": 0.9247706422018349, "acc_norm_stderr": 0.011308662537571743 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6388888888888888, "acc_stderr": 0.032757734861009996, "acc_norm": 0.6388888888888888, "acc_norm_stderr": 0.032757734861009996 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9166666666666666, "acc_stderr": 0.019398452135813905, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.019398452135813905 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9240506329113924, "acc_stderr": 0.0172446332510657, "acc_norm": 0.9240506329113924, "acc_norm_stderr": 0.0172446332510657 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.02693611191280227, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.02693611191280227 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8702290076335878, "acc_stderr": 0.029473649496907065, "acc_norm": 0.8702290076335878, "acc_norm_stderr": 0.029473649496907065 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8842975206611571, "acc_stderr": 0.029199802455622814, "acc_norm": 0.8842975206611571, "acc_norm_stderr": 0.029199802455622814 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8611111111111112, "acc_stderr": 0.03343270062869622, "acc_norm": 0.8611111111111112, "acc_norm_stderr": 0.03343270062869622 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8711656441717791, "acc_stderr": 0.026321383198783674, "acc_norm": 0.8711656441717791, "acc_norm_stderr": 0.026321383198783674 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5982142857142857, "acc_stderr": 0.04653333146973647, "acc_norm": 0.5982142857142857, "acc_norm_stderr": 0.04653333146973647 }, "harness|hendrycksTest-management|5": { "acc": 0.8932038834951457, "acc_stderr": 0.030581088928331356, "acc_norm": 0.8932038834951457, "acc_norm_stderr": 0.030581088928331356 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.015537514263253876, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.015537514263253876 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.86, "acc_stderr": 0.03487350880197769, "acc_norm": 0.86, "acc_norm_stderr": 0.03487350880197769 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9067688378033205, "acc_stderr": 0.01039741708729285, "acc_norm": 0.9067688378033205, "acc_norm_stderr": 0.01039741708729285 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.815028901734104, "acc_stderr": 0.02090397584208303, "acc_norm": 0.815028901734104, "acc_norm_stderr": 0.02090397584208303 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.7229050279329609, "acc_stderr": 0.01496877243581214, "acc_norm": 0.7229050279329609, "acc_norm_stderr": 0.01496877243581214 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8464052287581699, "acc_stderr": 0.020645597910418763, "acc_norm": 0.8464052287581699, "acc_norm_stderr": 0.020645597910418763 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8070739549839229, "acc_stderr": 0.022411516780911366, "acc_norm": 0.8070739549839229, "acc_norm_stderr": 0.022411516780911366 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8703703703703703, "acc_stderr": 0.018689725721062072, "acc_norm": 0.8703703703703703, "acc_norm_stderr": 0.018689725721062072 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.6134751773049646, "acc_stderr": 0.029049190342543465, "acc_norm": 0.6134751773049646, "acc_norm_stderr": 0.029049190342543465 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5958279009126467, "acc_stderr": 0.012533504046491367, "acc_norm": 0.5958279009126467, "acc_norm_stderr": 0.012533504046491367 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8345588235294118, "acc_stderr": 0.02257177102549475, "acc_norm": 0.8345588235294118, "acc_norm_stderr": 0.02257177102549475 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.8251633986928104, "acc_stderr": 0.015366167064780655, "acc_norm": 0.8251633986928104, "acc_norm_stderr": 0.015366167064780655 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7181818181818181, "acc_stderr": 0.043091187099464585, "acc_norm": 0.7181818181818181, "acc_norm_stderr": 0.043091187099464585 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8204081632653061, "acc_stderr": 0.024573293589585633, "acc_norm": 0.8204081632653061, "acc_norm_stderr": 0.024573293589585633 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8955223880597015, "acc_stderr": 0.021628920516700643, "acc_norm": 0.8955223880597015, "acc_norm_stderr": 0.021628920516700643 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.92, "acc_stderr": 0.0272659924344291, "acc_norm": 0.92, "acc_norm_stderr": 0.0272659924344291 }, "harness|hendrycksTest-virology|5": { "acc": 0.5963855421686747, "acc_stderr": 0.03819486140758398, "acc_norm": 0.5963855421686747, "acc_norm_stderr": 0.03819486140758398 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.02517298435015577, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.02517298435015577 }, "harness|truthfulqa:mc|0": { "mc1": 0.3733170134638923, "mc1_stderr": 0.01693237055757063, "mc2": 0.5312154281103626, "mc2_stderr": 0.015485998460539758 }, "harness|winogrande|5": { "acc": 0.8129439621152328, "acc_stderr": 0.01095971643524291 }, "harness|gsm8k|5": { "acc": 0.6285064442759667, "acc_stderr": 0.01330983907570649 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
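The "Latest results" block in the card above is a flat mapping from task name to metric dicts; below is a minimal, self-contained sketch of post-processing it locally. Only a few hendrycksTest entries from the card are reproduced for brevity; the full dict would include every "harness|..." entry shown above.

```python
# A few per-task scores copied from the "Latest results" block above; the real
# dict contains every "harness|..." entry shown in the card.
latest = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.41},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7185185185185186},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.875},
}

# Average the MMLU (hendrycksTest) accuracies; with the full set of subtasks
# this is how a local MMLU-style aggregate could be recomputed.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in latest.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```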
open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b
[ "region:us" ]
2024-01-28T23:14:33+00:00
{"pretty_name": "Evaluation run of migtissera/Tess-34B-v1.5b", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-34B-v1.5b](https://huggingface.co/migtissera/Tess-34B-v1.5b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T23:12:19.798626](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b/blob/main/results_2024-01-28T23-12-19.798626.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7571140160311144,\n \"acc_stderr\": 0.028404294310283486,\n \"acc_norm\": 0.7619075094631577,\n \"acc_norm_stderr\": 0.028933953410883263,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5312154281103626,\n \"mc2_stderr\": 0.015485998460539758\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979277,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175449\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6555467038438558,\n \"acc_stderr\": 0.004742185169264772,\n \"acc_norm\": 0.8442541326428998,\n \"acc_norm_stderr\": 0.0036187316588377205\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.025447863825108594,\n \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.025447863825108594\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.028919802956134905,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.028919802956134905\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.033917503223216586,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.033917503223216586\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838746,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838746\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6640211640211641,\n \"acc_stderr\": 0.024326310529149145,\n \"acc_norm\": 0.6640211640211641,\n \"acc_norm_stderr\": 0.024326310529149145\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9225806451612903,\n \"acc_stderr\": 0.015203644420774848,\n \"acc_norm\": 0.9225806451612903,\n \"acc_norm_stderr\": 0.015203644420774848\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6798029556650246,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.6798029556650246,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019949,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019949\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527034,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527034\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8179487179487179,\n 
\"acc_stderr\": 0.019565236782930883,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930883\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707952,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707952\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571743,\n \"acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571743\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9240506329113924,\n \"acc_stderr\": 0.0172446332510657,\n \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.0172446332510657\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869622,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869622\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n \"acc_stderr\": 0.01039741708729285,\n \"acc_norm\": 
0.9067688378033205,\n \"acc_norm_stderr\": 0.01039741708729285\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7229050279329609,\n \"acc_stderr\": 0.01496877243581214,\n \"acc_norm\": 0.7229050279329609,\n \"acc_norm_stderr\": 0.01496877243581214\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.020645597910418763,\n \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.020645597910418763\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6134751773049646,\n \"acc_stderr\": 0.029049190342543465,\n \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.029049190342543465\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5958279009126467,\n \"acc_stderr\": 0.012533504046491367,\n \"acc_norm\": 0.5958279009126467,\n \"acc_norm_stderr\": 0.012533504046491367\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549475,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549475\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8251633986928104,\n \"acc_stderr\": 0.015366167064780655,\n \"acc_norm\": 0.8251633986928104,\n \"acc_norm_stderr\": 0.015366167064780655\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585633,\n \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585633\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5963855421686747,\n \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.5963855421686747,\n \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5312154281103626,\n \"mc2_stderr\": 0.015485998460539758\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.01095971643524291\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6285064442759667,\n \"acc_stderr\": 0.01330983907570649\n }\n}\n```", "repo_url": 
"https://huggingface.co/migtissera/Tess-34B-v1.5b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T23_12_19.798626", "path": ["**/details_harness|winogrande|5_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T23-12-19.798626.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T23_12_19.798626", "path": ["results_2024-01-28T23-12-19.798626.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T23-12-19.798626.parquet"]}]}]}
2024-01-28T23:14:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.5b Dataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.5b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T23:12:19.798626 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
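The plain-text rendering above drops the loading snippet the card refers to; a minimal sketch of that call, following the pattern and config names given in the run metadata for this record (the "harness_winogrande_5" config and the "train"/"latest" splits come from that summary and config listing, not invented here):

```python
from datasets import load_dataset

# Per-task details for this evaluation run; the config name and split
# are taken from the dataset's own summary and config listing above.
data = load_dataset(
    "open-llm-leaderboard/details_migtissera__Tess-34B-v1.5b",
    "harness_winogrande_5",
    split="train",  # the card states "train" always points to the latest results
)
print(data)
```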
[ "# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.5b\n\n\n\nDataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:12:19.798626(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of migtissera/Tess-34B-v1.5b\n\n\n\nDataset automatically created during the evaluation run of model migtissera/Tess-34B-v1.5b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:12:19.798626(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
befd137da429ed72199db6318aa48c55d77c7843
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia3 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia3](https://huggingface.co/SC99/Mistral-7B-privatemix-ia3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-28T23:12:46.191755](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3/blob/main/results_2024-01-28T23-12-46.191755.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6477608633784878, "acc_stderr": 0.03220667400265738, "acc_norm": 0.6471674562866768, "acc_norm_stderr": 0.032886180571357726, "mc1": 0.565483476132191, "mc1_stderr": 0.01735273874925956, "mc2": 0.7012569167721324, "mc2_stderr": 0.01507953669061273 }, "harness|arc:challenge|25": { "acc": 0.7030716723549488, "acc_stderr": 0.013352025976725223, "acc_norm": 0.7337883959044369, "acc_norm_stderr": 0.012915774781523205 }, "harness|hellaswag|10": { "acc": 0.7150965943039235, "acc_stderr": 0.004504459553909765, "acc_norm": 0.8868751244771957, "acc_norm_stderr": 0.0031609804549511764 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266345, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266345 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5263157894736842, "acc_stderr": 0.046970851366478626, "acc_norm": 0.5263157894736842, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.025487187147859375, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.025487187147859375 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083522, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083522 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047711, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047711 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.02805779167298902, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.02805779167298902 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.029116617606083008, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.029116617606083008 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977945, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977945 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.015630022970092444, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.015630022970092444 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977748, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977748 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601436, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601436 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.036412970813137296, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.036412970813137296 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098823, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098823 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094633, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094633 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179326, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179326 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973136, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973136 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7167630057803468, "acc_stderr": 0.024257901705323385, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.024257901705323385 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4011173184357542, "acc_stderr": 0.01639222189940708, "acc_norm": 0.4011173184357542, "acc_norm_stderr": 0.01639222189940708 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7124183006535948, "acc_stderr": 0.02591780611714716, "acc_norm": 0.7124183006535948, "acc_norm_stderr": 0.02591780611714716 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4634941329856584, "acc_stderr": 0.012736153390214963, "acc_norm": 0.4634941329856584, "acc_norm_stderr": 0.012736153390214963 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6580882352941176, "acc_stderr": 0.028814722422254184, "acc_norm": 0.6580882352941176, "acc_norm_stderr": 0.028814722422254184 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6601307189542484, "acc_stderr": 0.019162418588623553, "acc_norm": 0.6601307189542484, "acc_norm_stderr": 0.019162418588623553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233278, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233278 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.565483476132191, "mc1_stderr": 0.01735273874925956, "mc2": 0.7012569167721324, "mc2_stderr": 0.01507953669061273 }, "harness|winogrande|5": { "acc": 0.8666140489344909, "acc_stderr": 0.009555448026422974 }, "harness|gsm8k|5": { "acc": 0.6664139499620925, "acc_stderr": 0.012987282131410809 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3
[ "region:us" ]
2024-01-28T23:15:05+00:00
{"pretty_name": "Evaluation run of SC99/Mistral-7B-privatemix-ia3", "dataset_summary": "Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia3](https://huggingface.co/SC99/Mistral-7B-privatemix-ia3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-28T23:12:46.191755](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3/blob/main/results_2024-01-28T23-12-46.191755.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6477608633784878,\n \"acc_stderr\": 0.03220667400265738,\n \"acc_norm\": 0.6471674562866768,\n \"acc_norm_stderr\": 0.032886180571357726,\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7012569167721324,\n \"mc2_stderr\": 0.01507953669061273\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523205\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7150965943039235,\n \"acc_stderr\": 0.004504459553909765,\n \"acc_norm\": 0.8868751244771957,\n \"acc_norm_stderr\": 0.0031609804549511764\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977945,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977945\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 
0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323385,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323385\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n \"acc_stderr\": 0.01639222189940708,\n \"acc_norm\": 0.4011173184357542,\n \"acc_norm_stderr\": 0.01639222189940708\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233278,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233278\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.7012569167721324,\n \"mc2_stderr\": 0.01507953669061273\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8666140489344909,\n \"acc_stderr\": 0.009555448026422974\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6664139499620925,\n \"acc_stderr\": 0.012987282131410809\n }\n}\n```", "repo_url": 
"https://huggingface.co/SC99/Mistral-7B-privatemix-ia3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_28T23_12_46.191755", "path": ["**/details_harness|winogrande|5_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-28T23-12-46.191755.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_28T23_12_46.191755", "path": ["results_2024-01-28T23-12-46.191755.parquet"]}, {"split": "latest", "path": ["results_2024-01-28T23-12-46.191755.parquet"]}]}]}
2024-01-28T23:15:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia3 Dataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia3 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-28T23:12:46.191755 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
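The loading snippet that "To load the details from a run, you can for instance do the following" refers to was dropped from this flattened text; it is preserved verbatim in this record's metadata above and reproduced here for readability.

```python
from datasets import load_dataset

# Load one evaluation config from this details repository; the repo id,
# config name, and split are taken from the "configs" list in the record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia3",
    "harness_winogrande_5",
    split="train",
)
```

Any of the other config names listed in the metadata (for example "harness_gsm8k_5") can be substituted for "harness_winogrande_5".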
[ "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia3\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:12:46.191755(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia3\n\n\n\nDataset automatically created during the evaluation run of model SC99/Mistral-7B-privatemix-ia3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-28T23:12:46.191755(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0992c69a898fcba16c442769f464bb433d0be736
# oasst1-21k-en This repository provides an instruction tuning dataset developed by [LLM-jp](https://llm-jp.nii.ac.jp/), a collaborative project launched in Japan. This dataset is an English subset of [oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1). ## Send Questions to llm-jp(at)nii.ac.jp ## Model Card Authors *The names are listed in alphabetical order.* Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto.
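As a usage sketch (not part of the original card): only the repository id llm-jp/oasst1-21k-en is confirmed by this record; the available splits and column names are assumptions, so the snippet below simply loads the dataset and inspects it.

```python
from datasets import load_dataset

# Hypothetical usage sketch: only the repository id is confirmed by this record.
# Split and column names are not specified in the card, so inspect them first.
ds = load_dataset("llm-jp/oasst1-21k-en")
print(ds)                    # shows the available splits and columns
first_split = next(iter(ds))
print(ds[first_split][0])    # peek at one example
```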
llm-jp/oasst1-21k-en
[ "size_categories:10K<n<100K", "language:en", "license:apache-2.0", "region:us" ]
2024-01-28T23:26:00+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"]}
2024-02-06T04:06:23+00:00
[]
[ "en" ]
TAGS #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
# oasst1-21k-en This repository provides an instruction tuning dataset developed by LLM-jp, a collaborative project launched in Japan. This dataset is an English subset of oasst1. ## Send Questions to llm-jp(at)URL ## Model Card Authors *The names are listed in alphabetical order.* Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto.
[ "# oasst1-21k-en\n\nThis repository provides an instruction tuning dataset developed by LLM-jp, a collaborative project launched in Japan.\n\nThis dataset is an English subset of oasst1.", "## Send Questions to\n\nllm-jp(at)URL", "## Model Card Authors\n*The names are listed in alphabetical order.*\n\nHirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto." ]
[ "TAGS\n#size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n", "# oasst1-21k-en\n\nThis repository provides an instruction tuning dataset developed by LLM-jp, a collaborative project launched in Japan.\n\nThis dataset is an English subset of oasst1.", "## Send Questions to\n\nllm-jp(at)URL", "## Model Card Authors\n*The names are listed in alphabetical order.*\n\nHirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto." ]
f05b5816a8c1ce8c1f5ae3cd87ae5a7b6409fea5
# oasst1-21k-ja This repository provides an instruction tuning dataset developed by [LLM-jp](https://llm-jp.nii.ac.jp/), a collaborative project launched in Japan. This dataset is a Japanese translation of an English subset of [oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1) using DeepL. English subset is [here](https://huggingface.co/datasets/llm-jp/oasst1-21k-en). ## Send Questions to llm-jp(at)nii.ac.jp ## Model Card Authors *The names are listed in alphabetical order.* Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto.
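A matching loading sketch for the Japanese translation is given below; as with the English subset, the split name is an assumption and should be checked against the repository.

```python
from datasets import load_dataset

# Minimal sketch: load the DeepL-translated Japanese counterpart.
# The "train" split name is an assumption; verify against the repository.
ds_ja = load_dataset("llm-jp/oasst1-21k-ja", split="train")
print(ds_ja[0])  # inspect one translated record
```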
llm-jp/oasst1-21k-ja
[ "size_categories:10K<n<100K", "language:ja", "license:apache-2.0", "region:us" ]
2024-01-28T23:27:03+00:00
{"language": ["ja"], "license": "apache-2.0", "size_categories": ["10K<n<100K"]}
2024-02-06T04:06:04+00:00
[]
[ "ja" ]
TAGS #size_categories-10K<n<100K #language-Japanese #license-apache-2.0 #region-us
# oasst1-21k-ja This repository provides an instruction tuning dataset developed by LLM-jp, a collaborative project launched in Japan. This dataset is a Japanese translation of an English subset of oasst1 using DeepL. English subset is here. ## Send Questions to llm-jp(at)URL ## Model Card Authors *The names are listed in alphabetical order.* Hirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto.
[ "# oasst1-21k-ja\n\nThis repository provides an instruction tuning dataset developed by LLM-jp, a collaborative project launched in Japan.\n\nThis dataset is a Japanese translation of an English subset of oasst1 using DeepL.\n\nEnglish subset is here.", "## Send Questions to\n\nllm-jp(at)URL", "## Model Card Authors\n*The names are listed in alphabetical order.*\n\nHirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto." ]
[ "TAGS\n#size_categories-10K<n<100K #language-Japanese #license-apache-2.0 #region-us \n", "# oasst1-21k-ja\n\nThis repository provides an instruction tuning dataset developed by LLM-jp, a collaborative project launched in Japan.\n\nThis dataset is a Japanese translation of an English subset of oasst1 using DeepL.\n\nEnglish subset is here.", "## Send Questions to\n\nllm-jp(at)URL", "## Model Card Authors\n*The names are listed in alphabetical order.*\n\nHirokazu Kiyomaru, Hiroshi Matsuda, Jun Suzuki, Namgi Han, Saku Sugawara, Shota Sasaki, Shuhei Kurita, Taishi Nakamura, Takashi Kodama, Takumi Okamoto." ]
f88623ddffb64cfb78d88c55d5b94c520a8c56a5
# Dataset Card for "counterfactual-babylm-pipps_removal" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kanishka/counterfactual-babylm-pipps_removal
[ "region:us" ]
2024-01-28T23:33:50+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 581830554, "num_examples": 11632119}, {"name": "validation", "num_bytes": 56120230, "num_examples": 1026747}], "download_size": 421726778, "dataset_size": 637950784}}
2024-01-28T23:34:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for "counterfactual-babylm-pipps_removal" More Information needed
[ "# Dataset Card for \"counterfactual-babylm-pipps_removal\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"counterfactual-babylm-pipps_removal\"\n\nMore Information needed" ]
6b80020098a67816ea34f5afc4451a17b6b8d14e
# Dataset Card for "counterfactual-babylm-keys_to_it_all_removal" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
kanishka/counterfactual-babylm-keys_to_it_all_removal
[ "region:us" ]
2024-01-28T23:44:25+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 581838721, "num_examples": 11634224}, {"name": "validation", "num_bytes": 56120230, "num_examples": 1026747}], "download_size": 421689270, "dataset_size": 637958951}}
2024-01-28T23:44:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for "counterfactual-babylm-keys_to_it_all_removal" More Information needed
[ "# Dataset Card for \"counterfactual-babylm-keys_to_it_all_removal\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"counterfactual-babylm-keys_to_it_all_removal\"\n\nMore Information needed" ]
d8288826be4223ea0842424f3d8770a8a2ae440a
# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6-merge <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [fionazhang/mistral-experiment-6-merge](https://huggingface.co/fionazhang/mistral-experiment-6-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fionazhang__mistral-experiment-6-merge", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T00:37:45.719694](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-experiment-6-merge/blob/main/results_2024-01-29T00-37-45.719694.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6273149761941949, "acc_stderr": 0.03276739884249563, "acc_norm": 0.6328964160481401, "acc_norm_stderr": 0.033428707149193194, "mc1": 0.2937576499388005, "mc1_stderr": 0.015945068581236614, "mc2": 0.4498908490105641, "mc2_stderr": 0.014475253736514098 }, "harness|arc:challenge|25": { "acc": 0.5895904436860068, "acc_stderr": 0.01437492219264266, "acc_norm": 0.6382252559726962, "acc_norm_stderr": 0.014041957945038076 }, "harness|hellaswag|10": { "acc": 0.6499701254730134, "acc_stderr": 0.004760041843651492, "acc_norm": 0.8424616610237005, "acc_norm_stderr": 0.003635630352475905 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5851851851851851, "acc_stderr": 0.04256193767901408, "acc_norm": 0.5851851851851851, "acc_norm_stderr": 0.04256193767901408 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.631578947368421, "acc_stderr": 0.03925523381052932, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.03925523381052932 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, 
"acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5895953757225434, "acc_stderr": 0.03750757044895537, "acc_norm": 0.5895953757225434, "acc_norm_stderr": 0.03750757044895537 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4411764705882353, "acc_stderr": 0.049406356306056595, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.049406356306056595 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5586206896551724, "acc_stderr": 0.04137931034482758, "acc_norm": 0.5586206896551724, "acc_norm_stderr": 0.04137931034482758 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055273, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055273 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7483870967741936, "acc_stderr": 0.02468597928623996, "acc_norm": 0.7483870967741936, "acc_norm_stderr": 0.02468597928623996 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.0351760354036101, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.0351760354036101 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198906, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198906 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.02381447708659356, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.02381447708659356 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6282051282051282, "acc_stderr": 0.024503472557110936, "acc_norm": 0.6282051282051282, "acc_norm_stderr": 0.024503472557110936 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066485, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.32450331125827814, "acc_stderr": 0.03822746937658754, "acc_norm": 0.32450331125827814, "acc_norm_stderr": 0.03822746937658754 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8128440366972477, "acc_stderr": 0.016722684526200172, "acc_norm": 0.8128440366972477, "acc_norm_stderr": 0.016722684526200172 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7745098039215687, "acc_stderr": 0.02933116229425174, "acc_norm": 0.7745098039215687, "acc_norm_stderr": 0.02933116229425174 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7383966244725738, "acc_stderr": 0.028609516716994934, "acc_norm": 0.7383966244725738, "acc_norm_stderr": 0.028609516716994934 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.031602951437766785, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.031602951437766785 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7407407407407407, "acc_stderr": 0.04236511258094632, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.04236511258094632 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742178, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742178 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597528, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597528 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7956577266922095, "acc_stderr": 0.014419123980931894, "acc_norm": 0.7956577266922095, "acc_norm_stderr": 0.014419123980931894 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6878612716763006, "acc_stderr": 0.024946792225272314, "acc_norm": 0.6878612716763006, "acc_norm_stderr": 0.024946792225272314 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38212290502793295, "acc_stderr": 0.01625113971157077, "acc_norm": 0.38212290502793295, "acc_norm_stderr": 0.01625113971157077 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729487, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729487 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.025583062489984824, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.025583062489984824 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7160493827160493, "acc_stderr": 0.025089478523765134, "acc_norm": 0.7160493827160493, "acc_norm_stderr": 
0.025089478523765134 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44328552803129073, "acc_stderr": 0.012687818419599924, "acc_norm": 0.44328552803129073, "acc_norm_stderr": 0.012687818419599924 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6764705882352942, "acc_stderr": 0.028418208619406755, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.028418208619406755 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6470588235294118, "acc_stderr": 0.01933314202079716, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.01933314202079716 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302506, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302506 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.02927956741106568, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.02927956741106568 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786848, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786848 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774711, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774711 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.2937576499388005, "mc1_stderr": 0.015945068581236614, "mc2": 0.4498908490105641, "mc2_stderr": 0.014475253736514098 }, "harness|winogrande|5": { "acc": 0.7797947908445146, "acc_stderr": 0.011646276755089688 }, "harness|gsm8k|5": { "acc": 0.3866565579984837, "acc_stderr": 0.013413955095965303 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_fionazhang__mistral-experiment-6-merge
[ "region:us" ]
2024-01-29T00:40:05+00:00
{"pretty_name": "Evaluation run of fionazhang/mistral-experiment-6-merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [fionazhang/mistral-experiment-6-merge](https://huggingface.co/fionazhang/mistral-experiment-6-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__mistral-experiment-6-merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T00:37:45.719694](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-experiment-6-merge/blob/main/results_2024-01-29T00-37-45.719694.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6273149761941949,\n \"acc_stderr\": 0.03276739884249563,\n \"acc_norm\": 0.6328964160481401,\n \"acc_norm_stderr\": 0.033428707149193194,\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4498908490105641,\n \"mc2_stderr\": 0.014475253736514098\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.01437492219264266,\n \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038076\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6499701254730134,\n \"acc_stderr\": 0.004760041843651492,\n \"acc_norm\": 0.8424616610237005,\n \"acc_norm_stderr\": 0.003635630352475905\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.02468597928623996,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.02468597928623996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659356,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 
0.02381447708659356\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200172,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n \"acc_stderr\": 0.014419123980931894,\n \"acc_norm\": 0.7956577266922095,\n \"acc_norm_stderr\": 0.014419123980931894\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.01625113971157077,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.01625113971157077\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729487,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729487\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4498908490105641,\n \"mc2_stderr\": 0.014475253736514098\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3866565579984837,\n \"acc_stderr\": 0.013413955095965303\n }\n}\n```", "repo_url": "https://huggingface.co/fionazhang/mistral-experiment-6-merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|arc:challenge|25_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|gsm8k|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hellaswag|10_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-37-45.719694.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-37-45.719694.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-37-45.719694.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T00-37-45.719694.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-37-45.719694.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["**/details_harness|winogrande|5_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-29T00-37-45.719694.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T00_37_45.719694", "path": ["results_2024-01-29T00-37-45.719694.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T00-37-45.719694.parquet"]}]}]}
2024-01-29T00:40:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6-merge Dataset automatically created during the evaluation run of model fionazhang/mistral-experiment-6-merge on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T00:37:45.719694 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
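The flattened card above says "you can for instance do the following:" but its code block was dropped during text extraction. A minimal sketch of what that load looks like: the config name comes from this record's metadata, while the repo id is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern for this model.

```python
from datasets import load_dataset

# Assumption: the details repo id follows the usual
# "open-llm-leaderboard/details_<org>__<model>" pattern for this model.
data = load_dataset(
    "open-llm-leaderboard/details_fionazhang__mistral-experiment-6-merge",
    "harness_winogrande_5",  # any config_name listed in the metadata works
    split="train",           # "train" always points to the latest results
)
print(data)
```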
[ "# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6-merge\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/mistral-experiment-6-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T00:37:45.719694(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6-merge\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/mistral-experiment-6-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T00:37:45.719694(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7faf22dbfbd63819fe1b0695ba7f0cd093322d52
# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [fionazhang/mistral-experiment-6](https://huggingface.co/fionazhang/mistral-experiment-6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fionazhang__mistral-experiment-6", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T00:42:51.247635](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-experiment-6/blob/main/results_2024-01-29T00-42-51.247635.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5538123362399926, "acc_stderr": 0.033998687888343836, "acc_norm": 0.5600824142135805, "acc_norm_stderr": 0.034730108194204, "mc1": 0.29865361077111385, "mc1_stderr": 0.016021570613768542, "mc2": 0.4568589633964796, "mc2_stderr": 0.01480166536535197 }, "harness|arc:challenge|25": { "acc": 0.5273037542662116, "acc_stderr": 0.014589589101985996, "acc_norm": 0.5580204778156996, "acc_norm_stderr": 0.014512682523128345 }, "harness|hellaswag|10": { "acc": 0.6227843059151563, "acc_stderr": 0.0048369903732615694, "acc_norm": 0.814479187412866, "acc_norm_stderr": 0.003879250555254521 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.42962962962962964, "acc_stderr": 0.04276349494376599, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5921052631578947, "acc_stderr": 0.039993097127774734, "acc_norm": 0.5921052631578947, "acc_norm_stderr": 0.039993097127774734 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6150943396226415, "acc_stderr": 0.02994649856769995, "acc_norm": 0.6150943396226415, "acc_norm_stderr": 0.02994649856769995 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5902777777777778, "acc_stderr": 0.04112490974670787, "acc_norm": 0.5902777777777778, "acc_norm_stderr": 0.04112490974670787 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 
0.049999999999999996 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201942, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201942 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5191489361702127, "acc_stderr": 0.032662042990646796, "acc_norm": 0.5191489361702127, "acc_norm_stderr": 0.032662042990646796 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.39473684210526316, "acc_stderr": 0.045981880578165414, "acc_norm": 0.39473684210526316, "acc_norm_stderr": 0.045981880578165414 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.4827586206896552, "acc_stderr": 0.04164188720169377, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.04164188720169377 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.025253032554997692, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.025253032554997692 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.041905964388711366, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.041905964388711366 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6935483870967742, "acc_stderr": 0.026226485652553883, "acc_norm": 0.6935483870967742, "acc_norm_stderr": 0.026226485652553883 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3842364532019704, "acc_stderr": 0.0342239856565755, "acc_norm": 0.3842364532019704, "acc_norm_stderr": 0.0342239856565755 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6121212121212121, "acc_stderr": 0.03804913653971012, "acc_norm": 0.6121212121212121, "acc_norm_stderr": 0.03804913653971012 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7323232323232324, "acc_stderr": 0.03154449888270285, "acc_norm": 0.7323232323232324, "acc_norm_stderr": 0.03154449888270285 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7875647668393783, "acc_stderr": 0.029519282616817216, "acc_norm": 0.7875647668393783, "acc_norm_stderr": 0.029519282616817216 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5230769230769231, "acc_stderr": 0.025323990861736232, "acc_norm": 0.5230769230769231, "acc_norm_stderr": 0.025323990861736232 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35555555555555557, "acc_stderr": 0.0291857149498574, "acc_norm": 0.35555555555555557, "acc_norm_stderr": 0.0291857149498574 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5336134453781513, "acc_stderr": 0.03240501447690071, "acc_norm": 0.5336134453781513, "acc_norm_stderr": 0.03240501447690071 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7504587155963303, "acc_stderr": 0.018553897629501628, "acc_norm": 0.7504587155963303, "acc_norm_stderr": 0.018553897629501628 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538271, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.03400603625538271 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6764705882352942, "acc_stderr": 0.032834720561085606, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.032834720561085606 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.679324894514768, "acc_stderr": 0.030381931949990407, "acc_norm": 0.679324894514768, "acc_norm_stderr": 0.030381931949990407 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5560538116591929, "acc_stderr": 0.03334625674242728, "acc_norm": 0.5560538116591929, "acc_norm_stderr": 0.03334625674242728 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.045218299028335865, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.045218299028335865 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.044532548363264673, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.044532548363264673 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7905982905982906, "acc_stderr": 0.026655699653922737, "acc_norm": 0.7905982905982906, "acc_norm_stderr": 0.026655699653922737 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7343550446998723, "acc_stderr": 0.01579430248788872, "acc_norm": 0.7343550446998723, "acc_norm_stderr": 0.01579430248788872 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6213872832369942, "acc_stderr": 0.02611374936131034, "acc_norm": 0.6213872832369942, "acc_norm_stderr": 0.02611374936131034 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.31843575418994413, "acc_stderr": 0.015581008080360276, "acc_norm": 0.31843575418994413, "acc_norm_stderr": 0.015581008080360276 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6339869281045751, "acc_stderr": 0.027582811415159617, "acc_norm": 0.6339869281045751, "acc_norm_stderr": 0.027582811415159617 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6559485530546624, "acc_stderr": 0.026981478043648043, "acc_norm": 0.6559485530546624, "acc_norm_stderr": 0.026981478043648043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6265432098765432, "acc_stderr": 0.026915003011380154, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.026915003011380154 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4326241134751773, "acc_stderr": 0.029555454236778855, "acc_norm": 0.4326241134751773, "acc_norm_stderr": 0.029555454236778855 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4041720990873533, "acc_stderr": 0.01253350404649136, "acc_norm": 0.4041720990873533, "acc_norm_stderr": 0.01253350404649136 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.02976826352893311, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.02976826352893311 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5473856209150327, "acc_stderr": 0.02013679091849253, "acc_norm": 0.5473856209150327, "acc_norm_stderr": 0.02013679091849253 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5755102040816327, "acc_stderr": 0.031642094879429414, "acc_norm": 0.5755102040816327, "acc_norm_stderr": 0.031642094879429414 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7711442786069652, "acc_stderr": 0.029705284056772436, "acc_norm": 0.7711442786069652, "acc_norm_stderr": 0.029705284056772436 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.038913644958358175, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.038913644958358175 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.032744852119469564, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.032744852119469564 }, "harness|truthfulqa:mc|0": { "mc1": 0.29865361077111385, "mc1_stderr": 0.016021570613768542, "mc2": 0.4568589633964796, "mc2_stderr": 0.01480166536535197 }, "harness|winogrande|5": { "acc": 0.7379636937647988, "acc_stderr": 0.012358944431637563 }, "harness|gsm8k|5": { "acc": 0.2221379833206975, "acc_stderr": 0.011449986902435321 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
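The card above shows how to pull a single per-task configuration; as a companion, a minimal sketch of loading the aggregated metrics instead, via the "results" configuration and its "latest" split (both names come from the card and the metadata below; the column layout of the results parquet is not described here):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# always points at the most recent evaluation (2024-01-29T00:42:51.247635).
results = load_dataset(
    "open-llm-leaderboard/details_fionazhang__mistral-experiment-6",
    "results",
    split="latest",
)
print(results)
```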
open-llm-leaderboard/details_fionazhang__mistral-experiment-6
[ "region:us" ]
2024-01-29T00:45:11+00:00
{"pretty_name": "Evaluation run of fionazhang/mistral-experiment-6", "dataset_summary": "Dataset automatically created during the evaluation run of model [fionazhang/mistral-experiment-6](https://huggingface.co/fionazhang/mistral-experiment-6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__mistral-experiment-6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T00:42:51.247635](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__mistral-experiment-6/blob/main/results_2024-01-29T00-42-51.247635.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5538123362399926,\n \"acc_stderr\": 0.033998687888343836,\n \"acc_norm\": 0.5600824142135805,\n \"acc_norm_stderr\": 0.034730108194204,\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4568589633964796,\n \"mc2_stderr\": 0.01480166536535197\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985996,\n \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128345\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6227843059151563,\n \"acc_stderr\": 0.0048369903732615694,\n \"acc_norm\": 0.814479187412866,\n \"acc_norm_stderr\": 0.003879250555254521\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6935483870967742,\n \"acc_stderr\": 0.026226485652553883,\n \"acc_norm\": 0.6935483870967742,\n \"acc_norm_stderr\": 0.026226485652553883\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.03804913653971012,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.03804913653971012\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817216,\n \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817216\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736232,\n \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736232\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.0291857149498574,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.0291857149498574\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.030381931949990407,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.030381931949990407\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7343550446998723,\n \"acc_stderr\": 0.01579430248788872,\n \"acc_norm\": 0.7343550446998723,\n \"acc_norm_stderr\": 0.01579430248788872\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131034,\n \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131034\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n \"acc_stderr\": 0.026981478043648043,\n \"acc_norm\": 0.6559485530546624,\n \"acc_norm_stderr\": 0.026981478043648043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.026915003011380154,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.026915003011380154\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4041720990873533,\n \"acc_stderr\": 0.01253350404649136,\n \"acc_norm\": 0.4041720990873533,\n \"acc_norm_stderr\": 0.01253350404649136\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.02976826352893311,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.02976826352893311\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5473856209150327,\n \"acc_stderr\": 0.02013679091849253,\n \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.02013679091849253\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.031642094879429414,\n \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.031642094879429414\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.038913644958358175,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.038913644958358175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4568589633964796,\n \"mc2_stderr\": 0.01480166536535197\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637563\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2221379833206975,\n \"acc_stderr\": 
0.011449986902435321\n }\n}\n```", "repo_url": "https://huggingface.co/fionazhang/mistral-experiment-6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|arc:challenge|25_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|gsm8k|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hellaswag|10_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-42-51.247635.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-42-51.247635.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-42-51.247635.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T00-42-51.247635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-42-51.247635.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T00_42_51.247635", "path": ["**/details_harness|winogrande|5_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T00-42-51.247635.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_29T00_42_51.247635", "path": ["results_2024-01-29T00-42-51.247635.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T00-42-51.247635.parquet"]}]}]}
2024-01-29T00:45:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6 Dataset automatically created during the evaluation run of model fionazhang/mistral-experiment-6 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T00:42:51.247635 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
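The card text above says "To load the details from a run, you can for instance do the following:" but no code block follows in this stripped-text field. A minimal loading sketch is given below; the repository id `open-llm-leaderboard/details_fionazhang__mistral-experiment-6` is an assumption inferred from the `details_<org>__<model>` naming pattern used by the other evaluation-details records in this dump, while `harness_winogrande_5` is one of the config names listed in this record's metadata and the `"train"` split convention comes from the card text itself.

```python
from datasets import load_dataset

# Repository id is assumed from the details_<org>__<model> naming pattern
# seen elsewhere in this dump; "harness_winogrande_5" is a config name
# listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_fionazhang__mistral-experiment-6",
    "harness_winogrande_5",
    split="train",  # per the card text, "train" always points to the latest results
)
print(data)
```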
[ "# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/mistral-experiment-6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T00:42:51.247635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fionazhang/mistral-experiment-6\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/mistral-experiment-6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T00:42:51.247635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
dcd2f5f67cc2e957dcfd95b3e3b08cf4dea0d079
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2](https://huggingface.co/Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-Cinder-1.3B-Reason-Test.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T01:07:57.572756](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-Cinder-1.3B-Reason-Test.2/blob/main/results_2024-01-29T01-07-57.572756.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.251327576424007, "acc_stderr": 0.030372163539921712, "acc_norm": 0.251062888632938, "acc_norm_stderr": 0.03108831080407431, "mc1": 0.24357405140758873, "mc1_stderr": 0.015026354824910782, "mc2": 0.3899721945335931, "mc2_stderr": 0.014222197893576758 }, "harness|arc:challenge|25": { "acc": 0.2977815699658703, "acc_stderr": 0.013363080107244487, "acc_norm": 0.32764505119453924, "acc_norm_stderr": 0.013715847940719346 }, "harness|hellaswag|10": { "acc": 0.4402509460266879, "acc_stderr": 0.004954026775425775, "acc_norm": 0.5826528579964151, "acc_norm_stderr": 0.00492113386493189 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.21481481481481482, "acc_stderr": 0.035478541985608236, "acc_norm": 0.21481481481481482, "acc_norm_stderr": 0.035478541985608236 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2490566037735849, "acc_stderr": 0.02661648298050171, "acc_norm": 0.2490566037735849, "acc_norm_stderr": 0.02661648298050171 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.22916666666666666, "acc_stderr": 0.035146974678623884, "acc_norm": 0.22916666666666666, "acc_norm_stderr": 0.035146974678623884 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.042295258468165085, "acc_norm": 0.23, "acc_norm_stderr": 0.042295258468165085 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.2, "acc_stderr": 0.04020151261036846, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.22, "acc_stderr": 0.04163331998932269, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932269 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.21965317919075145, "acc_stderr": 0.031568093627031744, "acc_norm": 0.21965317919075145, "acc_norm_stderr": 0.031568093627031744 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.1568627450980392, "acc_stderr": 0.036186648199362466, "acc_norm": 0.1568627450980392, "acc_norm_stderr": 0.036186648199362466 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.19, "acc_stderr": 0.039427724440366234, "acc_norm": 0.19, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.25957446808510637, "acc_stderr": 0.02865917937429232, "acc_norm": 0.25957446808510637, "acc_norm_stderr": 0.02865917937429232 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2631578947368421, "acc_stderr": 0.0414243971948936, "acc_norm": 0.2631578947368421, "acc_norm_stderr": 0.0414243971948936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.25517241379310346, "acc_stderr": 0.03632984052707842, "acc_norm": 0.25517241379310346, "acc_norm_stderr": 0.03632984052707842 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25925925925925924, "acc_stderr": 0.022569897074918417, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.022569897074918417 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2698412698412698, "acc_stderr": 0.039701582732351734, "acc_norm": 0.2698412698412698, "acc_norm_stderr": 0.039701582732351734 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.14, "acc_stderr": 0.0348735088019777, "acc_norm": 0.14, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.2870967741935484, "acc_stderr": 0.025736542745594525, "acc_norm": 0.2870967741935484, "acc_norm_stderr": 0.025736542745594525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.270935960591133, "acc_stderr": 0.031270907132977, "acc_norm": 0.270935960591133, "acc_norm_stderr": 0.031270907132977 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.23, "acc_stderr": 0.04229525846816505, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.28484848484848485, "acc_stderr": 0.035243908445117836, "acc_norm": 0.28484848484848485, "acc_norm_stderr": 0.035243908445117836 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.18181818181818182, "acc_stderr": 0.02747960301053878, "acc_norm": 0.18181818181818182, "acc_norm_stderr": 0.02747960301053878 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.23316062176165803, "acc_stderr": 0.03051611137147601, "acc_norm": 0.23316062176165803, "acc_norm_stderr": 0.03051611137147601 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2846153846153846, "acc_stderr": 0.022878322799706283, "acc_norm": 0.2846153846153846, "acc_norm_stderr": 0.022878322799706283 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275784, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275784 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21428571428571427, "acc_stderr": 0.02665353159671548, 
"acc_norm": 0.21428571428571427, "acc_norm_stderr": 0.02665353159671548 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2052980132450331, "acc_stderr": 0.03297986648473834, "acc_norm": 0.2052980132450331, "acc_norm_stderr": 0.03297986648473834 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.22385321100917432, "acc_stderr": 0.01787121776779022, "acc_norm": 0.22385321100917432, "acc_norm_stderr": 0.01787121776779022 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4074074074074074, "acc_stderr": 0.033509916046960436, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.033509916046960436 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.030190282453501954, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.030190282453501954 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.29535864978902954, "acc_stderr": 0.02969633871342288, "acc_norm": 0.29535864978902954, "acc_norm_stderr": 0.02969633871342288 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.35874439461883406, "acc_stderr": 0.032190792004199956, "acc_norm": 0.35874439461883406, "acc_norm_stderr": 0.032190792004199956 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.20610687022900764, "acc_stderr": 0.03547771004159463, "acc_norm": 0.20610687022900764, "acc_norm_stderr": 0.03547771004159463 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.16666666666666666, "acc_stderr": 0.036028141763926456, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.036028141763926456 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.26993865030674846, "acc_stderr": 0.03487825168497892, "acc_norm": 0.26993865030674846, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755805, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755805 }, "harness|hendrycksTest-management|5": { "acc": 0.2524271844660194, "acc_stderr": 0.04301250399690877, "acc_norm": 0.2524271844660194, "acc_norm_stderr": 0.04301250399690877 }, "harness|hendrycksTest-marketing|5": { "acc": 0.23931623931623933, "acc_stderr": 0.02795182680892433, "acc_norm": 0.23931623931623933, "acc_norm_stderr": 0.02795182680892433 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2784163473818646, "acc_stderr": 0.016028295188992462, "acc_norm": 0.2784163473818646, "acc_norm_stderr": 0.016028295188992462 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2514450867052023, "acc_stderr": 0.023357365785874044, "acc_norm": 0.2514450867052023, "acc_norm_stderr": 0.023357365785874044 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.24183006535947713, "acc_stderr": 0.024518195641879334, "acc_norm": 0.24183006535947713, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.2090032154340836, "acc_stderr": 0.02309314039837422, "acc_norm": 0.2090032154340836, "acc_norm_stderr": 0.02309314039837422 }, 
"harness|hendrycksTest-prehistory|5": { "acc": 0.24691358024691357, "acc_stderr": 0.023993501709042107, "acc_norm": 0.24691358024691357, "acc_norm_stderr": 0.023993501709042107 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.2198581560283688, "acc_stderr": 0.024706141070705484, "acc_norm": 0.2198581560283688, "acc_norm_stderr": 0.024706141070705484 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.23272490221642764, "acc_stderr": 0.0107925955538885, "acc_norm": 0.23272490221642764, "acc_norm_stderr": 0.0107925955538885 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.22426470588235295, "acc_stderr": 0.025336848563332355, "acc_norm": 0.22426470588235295, "acc_norm_stderr": 0.025336848563332355 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2434640522875817, "acc_stderr": 0.017362473762146634, "acc_norm": 0.2434640522875817, "acc_norm_stderr": 0.017362473762146634 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.2727272727272727, "acc_stderr": 0.04265792110940589, "acc_norm": 0.2727272727272727, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.1673469387755102, "acc_stderr": 0.023897144768914524, "acc_norm": 0.1673469387755102, "acc_norm_stderr": 0.023897144768914524 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.030769444967296018, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.030769444967296018 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-virology|5": { "acc": 0.25903614457831325, "acc_stderr": 0.03410646614071856, "acc_norm": 0.25903614457831325, "acc_norm_stderr": 0.03410646614071856 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.30994152046783624, "acc_stderr": 0.035469769593931624, "acc_norm": 0.30994152046783624, "acc_norm_stderr": 0.035469769593931624 }, "harness|truthfulqa:mc|0": { "mc1": 0.24357405140758873, "mc1_stderr": 0.015026354824910782, "mc2": 0.3899721945335931, "mc2_stderr": 0.014222197893576758 }, "harness|winogrande|5": { "acc": 0.6503551696921863, "acc_stderr": 0.013402073680850503 }, "harness|gsm8k|5": { "acc": 0.0401819560272934, "acc_stderr": 0.005409439736970487 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. 
--> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Josephgflowers__Tinyllama-Cinder-1.3B-Reason-Test.2
[ "region:us" ]
2024-01-29T01:09:47+00:00
{"pretty_name": "Evaluation run of Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2](https://huggingface.co/Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-Cinder-1.3B-Reason-Test.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T01:07:57.572756](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-Cinder-1.3B-Reason-Test.2/blob/main/results_2024-01-29T01-07-57.572756.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.251327576424007,\n \"acc_stderr\": 0.030372163539921712,\n \"acc_norm\": 0.251062888632938,\n \"acc_norm_stderr\": 0.03108831080407431,\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.3899721945335931,\n \"mc2_stderr\": 0.014222197893576758\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2977815699658703,\n \"acc_stderr\": 0.013363080107244487,\n \"acc_norm\": 0.32764505119453924,\n \"acc_norm_stderr\": 0.013715847940719346\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4402509460266879,\n \"acc_stderr\": 0.004954026775425775,\n \"acc_norm\": 0.5826528579964151,\n \"acc_norm_stderr\": 0.00492113386493189\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.035478541985608236,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.035478541985608236\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.02661648298050171,\n \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.02661648298050171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n 
\"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165085,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.039701582732351734,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.039701582732351734\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n \"acc_stderr\": 0.025736542745594525,\n \"acc_norm\": 0.2870967741935484,\n \"acc_norm_stderr\": 0.025736542745594525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132977,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132977\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.02747960301053878,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.02747960301053878\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 
0.03051611137147601,\n \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2846153846153846,\n \"acc_stderr\": 0.022878322799706283,\n \"acc_norm\": 0.2846153846153846,\n \"acc_norm_stderr\": 0.022878322799706283\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275784,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275784\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22385321100917432,\n \"acc_stderr\": 0.01787121776779022,\n \"acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.01787121776779022\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960436,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960436\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.29535864978902954,\n \"acc_stderr\": 0.02969633871342288,\n \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.02969633871342288\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n \"acc_stderr\": 0.02795182680892433,\n \"acc_norm\": 0.23931623931623933,\n \"acc_norm_stderr\": 0.02795182680892433\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n 
\"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2784163473818646,\n \"acc_stderr\": 0.016028295188992462,\n \"acc_norm\": 0.2784163473818646,\n \"acc_norm_stderr\": 0.016028295188992462\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874044,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874044\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2090032154340836,\n \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.2090032154340836,\n \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705484,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705484\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23272490221642764,\n \"acc_stderr\": 0.0107925955538885,\n \"acc_norm\": 0.23272490221642764,\n \"acc_norm_stderr\": 0.0107925955538885\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.025336848563332355,\n \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.025336848563332355\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146634,\n \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146634\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.035469769593931624,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.035469769593931624\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.3899721945335931,\n \"mc2_stderr\": 0.014222197893576758\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.6503551696921863,\n \"acc_stderr\": 0.013402073680850503\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \"acc_stderr\": 0.005409439736970487\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|arc:challenge|25_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|gsm8k|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hellaswag|10_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-07-57.572756.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-07-57.572756.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-07-57.572756.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T01-07-57.572756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-07-57.572756.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["**/details_harness|winogrande|5_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T01-07-57.572756.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T01_07_57.572756", "path": ["results_2024-01-29T01-07-57.572756.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T01-07-57.572756.parquet"]}]}]}
2024-01-29T01:10:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2 Dataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T01:07:57.572756 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
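The card above says the details from a run can be loaded with a short snippet, but the code example has been stripped from this processed copy of the text. A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming pattern (the exact repository id is not shown in this record) and that the configuration and split names match the metadata listed above:

```python
from datasets import load_dataset

# Assumed repository id, inferred from the leaderboard's details_<org>__<model> naming pattern.
repo_id = "open-llm-leaderboard/details_Josephgflowers__Tinyllama-Cinder-1.3B-Reason-Test.2"

# Each evaluated task is its own configuration; the "latest" split points to the newest run.
winogrande_details = load_dataset(repo_id, "harness_winogrande_5", split="latest")

# The aggregated scores live in the separate "results" configuration.
aggregated = load_dataset(repo_id, "results", split="latest")

print(winogrande_details)
print(aggregated)
```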
[ "# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T01:07:57.572756(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T01:07:57.572756(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f8a0c086979de3b5849ff1f96c1f46bb258447ce
# Dataset Card for "iemocap_unit" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/iemocap_unit
[ "region:us" ]
2024-01-29T01:46:50+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k_12bps", "path": "data/encodec_24k_12bps-*"}, {"split": "encodec_24k_1_5bps", "path": "data/encodec_24k_1_5bps-*"}, {"split": "encodec_24k_24bps", "path": "data/encodec_24k_24bps-*"}, {"split": "encodec_24k_3bps", "path": "data/encodec_24k_3bps-*"}, {"split": "encodec_24k_6bps", "path": "data/encodec_24k_6bps-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "unit", "sequence": {"sequence": "int64"}}], "splits": [{"name": "academicodec_hifi_16k_320d", "num_bytes": 40413131, "num_examples": 5531}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 40413131, "num_examples": 5531}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 60540299, "num_examples": 5531}, {"name": "audiodec_24k_320d", "num_bytes": 129312379, "num_examples": 5531}, {"name": "dac_16k", "num_bytes": 147841611, "num_examples": 5531}, {"name": "dac_24k", "num_bytes": 590078427, "num_examples": 5531}, {"name": "dac_44k", "num_bytes": 190678311, "num_examples": 5531}, {"name": "encodec_24k_12bps", "num_bytes": 242357531, "num_examples": 5531}, {"name": "encodec_24k_1_5bps", "num_bytes": 30432307, "num_examples": 5531}, {"name": "encodec_24k_24bps", "num_bytes": 484557787, "num_examples": 5531}, {"name": "encodec_24k_3bps", "num_bytes": 60707339, "num_examples": 5531}, {"name": "encodec_24k_6bps", "num_bytes": 121257403, "num_examples": 5531}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 323620059, "num_examples": 5531}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 323620059, "num_examples": 5531}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 323437275, "num_examples": 5531}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 162512347, "num_examples": 5531}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 323437275, "num_examples": 5531}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 162512347, "num_examples": 5531}, {"name": "speech_tokenizer_16k", "num_bytes": 80977275, "num_examples": 5531}], "download_size": 585888386, "dataset_size": 3838706293}}
2024-01-29T01:48:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "iemocap_unit" More Information needed
[ "# Dataset Card for \"iemocap_unit\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"iemocap_unit\"\n\nMore Information needed" ]
eeaf8384cc6a4a9b18c3c247bcc5d3c70a0cea38
# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-environment-merge <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [fionazhang/fine-tune-mistral-environment-merge](https://huggingface.co/fionazhang/fine-tune-mistral-environment-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T01:47:21.122290](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge/blob/main/results_2024-01-29T01-47-21.122290.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6354317163514616, "acc_stderr": 0.032307072675482454, "acc_norm": 0.6419311935208026, "acc_norm_stderr": 0.03296064982960984, "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.4397408572877062, "mc2_stderr": 0.014194431681893268 }, "harness|arc:challenge|25": { "acc": 0.5725255972696246, "acc_stderr": 0.014456862944650649, "acc_norm": 0.6262798634812287, "acc_norm_stderr": 0.014137708601759086 }, "harness|hellaswag|10": { "acc": 0.635929097789285, "acc_stderr": 0.00480185288132974, "acc_norm": 0.8365863373829915, "acc_norm_stderr": 0.0036898701424130753 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6447368421052632, "acc_stderr": 0.038947344870133176, "acc_norm": 0.6447368421052632, "acc_norm_stderr": 0.038947344870133176 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6830188679245283, "acc_stderr": 0.028637235639800893, "acc_norm": 0.6830188679245283, "acc_norm_stderr": 0.028637235639800893 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 
0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.04702880432049615, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3783068783068783, "acc_stderr": 0.024976954053155254, "acc_norm": 0.3783068783068783, "acc_norm_stderr": 0.024976954053155254 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.043902592653775614, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.043902592653775614 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7548387096774194, "acc_stderr": 0.024472243840895525, "acc_norm": 0.7548387096774194, "acc_norm_stderr": 0.024472243840895525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.03515895551165698, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.03515895551165698 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.03374402644139403, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.03374402644139403 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.02338193534812142, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.02338193534812142 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6410256410256411, "acc_stderr": 0.02432173848460235, "acc_norm": 0.6410256410256411, "acc_norm_stderr": 0.02432173848460235 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3592592592592593, "acc_stderr": 0.029252905927251976, "acc_norm": 0.3592592592592593, "acc_norm_stderr": 0.029252905927251976 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6512605042016807, "acc_stderr": 0.030956636328566548, "acc_norm": 0.6512605042016807, "acc_norm_stderr": 0.030956636328566548 }, 
"harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8201834862385321, "acc_stderr": 0.01646534546739154, "acc_norm": 0.8201834862385321, "acc_norm_stderr": 0.01646534546739154 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5416666666666666, "acc_stderr": 0.033981108902946366, "acc_norm": 0.5416666666666666, "acc_norm_stderr": 0.033981108902946366 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7843137254901961, "acc_stderr": 0.02886743144984932, "acc_norm": 0.7843137254901961, "acc_norm_stderr": 0.02886743144984932 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467617, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467617 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.03114679648297246, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.03114679648297246 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.034465133507525975, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.034465133507525975 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8016528925619835, "acc_stderr": 0.036401182719909456, "acc_norm": 0.8016528925619835, "acc_norm_stderr": 0.036401182719909456 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.8058252427184466, "acc_stderr": 0.03916667762822585, "acc_norm": 0.8058252427184466, "acc_norm_stderr": 0.03916667762822585 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8071519795657727, "acc_stderr": 0.014108533515757431, "acc_norm": 0.8071519795657727, "acc_norm_stderr": 0.014108533515757431 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.024182427496577615, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.024182427496577615 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36089385474860336, "acc_stderr": 0.016062290671110462, "acc_norm": 0.36089385474860336, "acc_norm_stderr": 0.016062290671110462 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.0248480182638752, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.0248480182638752 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632938, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632938 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7191358024691358, "acc_stderr": 0.02500646975579921, "acc_norm": 0.7191358024691358, 
"acc_norm_stderr": 0.02500646975579921 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4471968709256845, "acc_stderr": 0.012698825252435108, "acc_norm": 0.4471968709256845, "acc_norm_stderr": 0.012698825252435108 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462923, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462923 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6650326797385621, "acc_stderr": 0.019094228167000314, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.019094228167000314 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.028795185574291296, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.028795185574291296 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8109452736318408, "acc_stderr": 0.02768691358801302, "acc_norm": 0.8109452736318408, "acc_norm_stderr": 0.02768691358801302 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.29008567931456547, "mc1_stderr": 0.01588623687420952, "mc2": 0.4397408572877062, "mc2_stderr": 0.014194431681893268 }, "harness|winogrande|5": { "acc": 0.7892659826361483, "acc_stderr": 0.011462046419710681 }, "harness|gsm8k|5": { "acc": 0.3525398028809704, "acc_stderr": 0.013159909755930323 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge
[ "region:us" ]
2024-01-29T01:49:47+00:00
{"pretty_name": "Evaluation run of fionazhang/fine-tune-mistral-environment-merge", "dataset_summary": "Dataset automatically created during the evaluation run of model [fionazhang/fine-tune-mistral-environment-merge](https://huggingface.co/fionazhang/fine-tune-mistral-environment-merge) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T01:47:21.122290](https://huggingface.co/datasets/open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge/blob/main/results_2024-01-29T01-47-21.122290.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6354317163514616,\n \"acc_stderr\": 0.032307072675482454,\n \"acc_norm\": 0.6419311935208026,\n \"acc_norm_stderr\": 0.03296064982960984,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4397408572877062,\n \"mc2_stderr\": 0.014194431681893268\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759086\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.635929097789285,\n \"acc_stderr\": 0.00480185288132974,\n \"acc_norm\": 0.8365863373829915,\n \"acc_norm_stderr\": 0.0036898701424130753\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 
0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n \"acc_norm\": 
0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739154,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739154\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n \"acc_stderr\": 0.016062290671110462,\n \"acc_norm\": 0.36089385474860336,\n \"acc_norm_stderr\": 0.016062290671110462\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632938,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632938\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n \"acc_stderr\": 0.012698825252435108,\n \"acc_norm\": 0.4471968709256845,\n \"acc_norm_stderr\": 0.012698825252435108\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000314,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000314\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4397408572877062,\n \"mc2_stderr\": 0.014194431681893268\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710681\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3525398028809704,\n \"acc_stderr\": 0.013159909755930323\n }\n}\n```", "repo_url": "https://huggingface.co/fionazhang/fine-tune-mistral-environment-merge", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|arc:challenge|25_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|gsm8k|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hellaswag|10_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T01-47-21.122290.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["**/details_harness|winogrande|5_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-29T01-47-21.122290.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T01_47_21.122290", "path": ["results_2024-01-29T01-47-21.122290.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T01-47-21.122290.parquet"]}]}]}
2024-01-29T01:50:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-environment-merge Dataset automatically created during the evaluation run of model fionazhang/fine-tune-mistral-environment-merge on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading example below): ## Latest results These are the latest results from run 2024-01-29T01:47:21.122290 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
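The loading example referenced above uses the `datasets` library; the repository and configuration names below come from this record's metadata, and `harness_winogrande_5` is just one of the 63 available configurations:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; substitute any other
# configuration name listed in the metadata for "harness_winogrande_5".
data = load_dataset(
    "open-llm-leaderboard/details_fionazhang__fine-tune-mistral-environment-merge",
    "harness_winogrande_5",
    split="train",
)
```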
[ "# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-environment-merge\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/fine-tune-mistral-environment-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T01:47:21.122290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of fionazhang/fine-tune-mistral-environment-merge\n\n\n\nDataset automatically created during the evaluation run of model fionazhang/fine-tune-mistral-environment-merge on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T01:47:21.122290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5af3327548c9e4bb9123ceb8ce6ff76e2799c452
# Home Assistant Requests Dataset This dataset contains a list of requests and responses for a user interacting with a personal assistant that controls an instance of [Home Assistant](https://www.home-assistant.io/). The dataset is generated from the different CSV "piles". The "piles" contain different chunks of requests that are assembled into a final context that is presented to the LLM. For example, `piles/pile_of_device_names.csv` contains only names of various devices to be used as part of the context as well as inserted into `piles/pile_of_templated_actions.csv` and `piles/pile_of_status_requests.csv`. The logic for assembling the final dataset from the piles is contained in [generate_home_assistant_data.py](./generate_home_assistant_data.py). ## Generating the dataset from piles `python3 generate_home_assistant_data.py --train --test --large` Supported dataset splits are `--test`, `--train`, & `--sample`. Arguments to set the train dataset size are `--small`, `--medium`, `--large`, & `--xl`. ## Merging with other instruct-datasets for training `python3 generate_home_assistant_data.py --merge <dataset>` Supported datasets right now are: - `alpaca` - `wizardlm70k` Please note that the supported datasets all have different licenses. Be aware that the license of the resulting data mixture might be different from the license of this dataset alone.
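A minimal hypothetical sketch of the pile-assembly step described above is shown below; the CSV column names and the `<device_name>` placeholder are invented for illustration, and the real logic lives in generate_home_assistant_data.py:

```python
import csv
import random

# Hypothetical column names and placeholder token -- the actual ones are defined
# by the CSV piles and generate_home_assistant_data.py, not by this sketch.
with open("piles/pile_of_device_names.csv", newline="") as f:
    device_names = [row["device_name"] for row in csv.DictReader(f)]

with open("piles/pile_of_templated_actions.csv", newline="") as f:
    templates = [row["template"] for row in csv.DictReader(f)]

# Assemble one request by inserting a random device name into a templated action.
request = random.choice(templates).replace("<device_name>", random.choice(device_names))
print(request)
```

The actual script additionally draws on the status-request pile and applies the split and size flags documented above when writing the final dataset.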
acon96/Home-Assistant-Requests
[ "task_categories:question-answering", "task_categories:text-generation", "size_categories:10K<n<100k", "language:en", "license:mit", "automation", "home", "assistant", "region:us" ]
2024-01-29T01:58:04+00:00
{"language": ["en"], "license": "mit", "size_categories": ["10K<n<100k"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "Home Assistant Requests", "tags": ["automation", "home", "assistant"]}
2024-01-29T01:58:14+00:00
[]
[ "en" ]
TAGS #task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100k #language-English #license-mit #automation #home #assistant #region-us
# Home Assistant Requests Dataset This dataset contains a list of requests and responses for a user interacting with a personal assistant that controls an instance of Home Assistant. The dataset is generated from the different CSV "piles". The "piles" contain different chunks of requests that are assembled into a final context that is presented to the LLM. For example, 'piles/pile_of_device_names.csv' contains only names of various devices to be used as part of context as well as inserted into 'piles/pile_of_templated_actions.csv' and 'piles/pile_of_status_requests.csv'. The logic for assembling the final dataset from the piles is contained in generate_home_assistant_data.py. ## Generating the dataset from piles 'python3 generate_home_assistant_data.py --train --test --large' Supported dataset splits are '--test', '--train', & '--sample' Arguments to set the train dataset size are '--small', '--medium', '--large', & '--xl'. ## Merging with other instruct-datasets for training 'python3 generate_home_assistant_data.py --merge <dataset>' Supported datasets right now are: - 'alpaca' - 'wizardlm70k' Please note that the supported datasets all have different licenses. Be aware that the license of the resulting data mixture might be different that the license of this dataset alone.
[ "# Home Assistant Requests Dataset\n\nThis dataset contains a list of requests and responses for a user interacting with a personal assistant that controls an instance of Home Assistant.\n\nThe dataset is generated from the different CSV \"piles\". The \"piles\" contain different chunks of requests that are assembled into a final context that is presented to the LLM. For example, 'piles/pile_of_device_names.csv' contains only names of various devices to be used as part of context as well as inserted into 'piles/pile_of_templated_actions.csv' and 'piles/pile_of_status_requests.csv'. The logic for assembling the final dataset from the piles is contained in generate_home_assistant_data.py.", "## Generating the dataset from piles\n\n'python3 generate_home_assistant_data.py --train --test --large'\n\nSupported dataset splits are '--test', '--train', & '--sample'\nArguments to set the train dataset size are '--small', '--medium', '--large', & '--xl'.", "## Merging with other instruct-datasets for training\n\n'python3 generate_home_assistant_data.py --merge <dataset>'\n\nSupported datasets right now are: \n- 'alpaca'\n- 'wizardlm70k'\n\nPlease note that the supported datasets all have different licenses. Be aware that the license of the resulting data mixture might be different that the license of this dataset alone." ]
[ "TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100k #language-English #license-mit #automation #home #assistant #region-us \n", "# Home Assistant Requests Dataset\n\nThis dataset contains a list of requests and responses for a user interacting with a personal assistant that controls an instance of Home Assistant.\n\nThe dataset is generated from the different CSV \"piles\". The \"piles\" contain different chunks of requests that are assembled into a final context that is presented to the LLM. For example, 'piles/pile_of_device_names.csv' contains only names of various devices to be used as part of context as well as inserted into 'piles/pile_of_templated_actions.csv' and 'piles/pile_of_status_requests.csv'. The logic for assembling the final dataset from the piles is contained in generate_home_assistant_data.py.", "## Generating the dataset from piles\n\n'python3 generate_home_assistant_data.py --train --test --large'\n\nSupported dataset splits are '--test', '--train', & '--sample'\nArguments to set the train dataset size are '--small', '--medium', '--large', & '--xl'.", "## Merging with other instruct-datasets for training\n\n'python3 generate_home_assistant_data.py --merge <dataset>'\n\nSupported datasets right now are: \n- 'alpaca'\n- 'wizardlm70k'\n\nPlease note that the supported datasets all have different licenses. Be aware that the license of the resulting data mixture might be different that the license of this dataset alone." ]
bac472281aec9ff8b553a773c391ea95742ba38b
# Dataset Card for Evaluation run of kwchoi/DPO_mistral_7b_ultra_0129_1k <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [kwchoi/DPO_mistral_7b_ultra_0129_1k](https://huggingface.co/kwchoi/DPO_mistral_7b_ultra_0129_1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T02:14:53.024485](https://huggingface.co/datasets/open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k/blob/main/results_2024-01-29T02-14-53.024485.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6099611898518866, "acc_stderr": 0.033230906844496894, "acc_norm": 0.6150815540917772, "acc_norm_stderr": 0.03391141680931445, "mc1": 0.5287637698898409, "mc1_stderr": 0.017474513848525518, "mc2": 0.6834430823847926, "mc2_stderr": 0.015682529110673984 }, "harness|arc:challenge|25": { "acc": 0.6049488054607508, "acc_stderr": 0.01428589829293817, "acc_norm": 0.6416382252559727, "acc_norm_stderr": 0.014012883334859857 }, "harness|hellaswag|10": { "acc": 0.6885082652857997, "acc_stderr": 0.004621568125102045, "acc_norm": 0.8554072893845848, "acc_norm_stderr": 0.003509709647791843 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368879, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6447368421052632, "acc_stderr": 0.038947344870133176, "acc_norm": 0.6447368421052632, "acc_norm_stderr": 0.038947344870133176 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.02890159361241178, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.02890159361241178 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7013888888888888, "acc_stderr": 0.03827052357950756, "acc_norm": 0.7013888888888888, "acc_norm_stderr": 0.03827052357950756 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.037143259063020656, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.037143259063020656 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6137931034482759, "acc_stderr": 0.04057324734419035, "acc_norm": 0.6137931034482759, "acc_norm_stderr": 0.04057324734419035 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.025010749116137602, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.025010749116137602 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6580645161290323, "acc_stderr": 0.026985289576552732, "acc_norm": 0.6580645161290323, "acc_norm_stderr": 0.026985289576552732 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.0351760354036101, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.0351760354036101 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.62, "acc_stderr": 0.04878317312145632, "acc_norm": 0.62, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7474747474747475, "acc_stderr": 0.030954055470365897, "acc_norm": 0.7474747474747475, "acc_norm_stderr": 0.030954055470365897 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.025787723180723875, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.025787723180723875 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5820512820512821, "acc_stderr": 0.02500732988246122, "acc_norm": 0.5820512820512821, "acc_norm_stderr": 0.02500732988246122 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948492, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948492 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6386554621848739, "acc_stderr": 0.03120469122515002, "acc_norm": 0.6386554621848739, "acc_norm_stderr": 0.03120469122515002 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, 
"acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7908256880733945, "acc_stderr": 0.017437937173343233, "acc_norm": 0.7908256880733945, "acc_norm_stderr": 0.017437937173343233 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7352941176470589, "acc_stderr": 0.030964517926923393, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.030964517926923393 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.026750826994676166, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.026750826994676166 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5964125560538116, "acc_stderr": 0.03292802819330313, "acc_norm": 0.5964125560538116, "acc_norm_stderr": 0.03292802819330313 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.034089978868575295, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.04726835553719099, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.04726835553719099 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.776500638569604, "acc_stderr": 0.01489723522945071, "acc_norm": 0.776500638569604, "acc_norm_stderr": 0.01489723522945071 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6820809248554913, "acc_stderr": 0.025070713719153193, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.025070713719153193 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.28268156424581004, "acc_stderr": 0.015060381730018097, "acc_norm": 0.28268156424581004, "acc_norm_stderr": 0.015060381730018097 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6830065359477124, "acc_stderr": 0.02664327847450875, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.02664327847450875 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6816720257234726, "acc_stderr": 0.026457225067811025, "acc_norm": 0.6816720257234726, "acc_norm_stderr": 0.026457225067811025 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6975308641975309, "acc_stderr": 0.025557653981868052, "acc_norm": 0.6975308641975309, "acc_norm_stderr": 0.025557653981868052 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.02975238965742705, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.02975238965742705 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.43546284224250326, "acc_stderr": 0.012663412101248333, "acc_norm": 0.43546284224250326, "acc_norm_stderr": 0.012663412101248333 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.029349803139765873, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.029349803139765873 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.619281045751634, "acc_stderr": 0.019643801557924803, "acc_norm": 0.619281045751634, "acc_norm_stderr": 0.019643801557924803 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.689795918367347, "acc_stderr": 0.029613459872484378, "acc_norm": 0.689795918367347, "acc_norm_stderr": 0.029613459872484378 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7064676616915423, "acc_stderr": 0.03220024104534205, "acc_norm": 0.7064676616915423, "acc_norm_stderr": 0.03220024104534205 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640038, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640038 }, "harness|truthfulqa:mc|0": { "mc1": 0.5287637698898409, "mc1_stderr": 0.017474513848525518, "mc2": 0.6834430823847926, "mc2_stderr": 0.015682529110673984 }, "harness|winogrande|5": { "acc": 0.7719021310181531, "acc_stderr": 0.011793015817663597 }, "harness|gsm8k|5": { "acc": 0.34950720242608035, "acc_stderr": 0.013133836511705995 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
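In addition to the per-task loading example above, here is a minimal sketch (assuming the `datasets` library is installed and that the aggregated "results" configuration follows the same split-naming convention as the per-task configurations) for listing the available configurations and pulling the latest aggregated metrics:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split always points at the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```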
open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k
[ "region:us" ]
2024-01-29T02:10:47+00:00
{"pretty_name": "Evaluation run of kwchoi/DPO_mistral_7b_ultra_0129_1k", "dataset_summary": "Dataset automatically created during the evaluation run of model [kwchoi/DPO_mistral_7b_ultra_0129_1k](https://huggingface.co/kwchoi/DPO_mistral_7b_ultra_0129_1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T02:14:53.024485](https://huggingface.co/datasets/open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k/blob/main/results_2024-01-29T02-14-53.024485.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6099611898518866,\n \"acc_stderr\": 0.033230906844496894,\n \"acc_norm\": 0.6150815540917772,\n \"acc_norm_stderr\": 0.03391141680931445,\n \"mc1\": 0.5287637698898409,\n \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6834430823847926,\n \"mc2_stderr\": 0.015682529110673984\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.01428589829293817,\n \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859857\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6885082652857997,\n \"acc_stderr\": 0.004621568125102045,\n \"acc_norm\": 0.8554072893845848,\n \"acc_norm_stderr\": 0.003509709647791843\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552732,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552732\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330313,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330313\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n 
\"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153193,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153193\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n \"acc_stderr\": 0.015060381730018097,\n \"acc_norm\": 0.28268156424581004,\n \"acc_norm_stderr\": 0.015060381730018097\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868052,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868052\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n \"acc_stderr\": 0.012663412101248333,\n \"acc_norm\": 0.43546284224250326,\n \"acc_norm_stderr\": 0.012663412101248333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6834430823847926,\n \"mc2_stderr\": 0.015682529110673984\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663597\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34950720242608035,\n \"acc_stderr\": 0.013133836511705995\n }\n}\n```", "repo_url": 
"https://huggingface.co/kwchoi/DPO_mistral_7b_ultra_0129_1k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|arc:challenge|25_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|arc:challenge|25_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|gsm8k|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|gsm8k|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hellaswag|10_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hellaswag|10_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-08-21.931644.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T02-08-21.931644.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-14-53.024485.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-14-53.024485.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-14-53.024485.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T02-14-53.024485.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-08-21.931644.parquet"]}, 
{"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["**/details_harness|winogrande|5_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": ["**/details_harness|winogrande|5_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T02-14-53.024485.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T02_08_21.931644", "path": ["results_2024-01-29T02-08-21.931644.parquet"]}, {"split": "2024_01_29T02_14_53.024485", "path": 
["results_2024-01-29T02-14-53.024485.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T02-14-53.024485.parquet"]}]}]}
2024-01-29T02:17:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kwchoi/DPO_mistral_7b_ultra_0129_1k Dataset automatically created during the evaluation run of model kwchoi/DPO_mistral_7b_ultra_0129_1k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T02:14:53.024485 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
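The loading instructions in the card text above stop at "you can for instance do the following:" because the code snippet was stripped during processing. A minimal sketch of that step, assuming the details repository follows the usual Open LLM Leaderboard naming scheme for this model (the exact repo id is not shown in this record, so treat it as an assumption); the configuration and split names mirror the `config_name`/`split` entries in the metadata above:

```python
# Hedged sketch: the repo id is assumed from the standard
# "open-llm-leaderboard/details_<org>__<model>" pattern, not taken from this record.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_kwchoi__DPO_mistral_7b_ultra_0129_1k",
    "harness_winogrande_5",   # one of the 63 per-task configurations
    split="latest",           # or the timestamped split "2024_01_29T02_14_53.024485"
)
print(data)
```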
[ "# Dataset Card for Evaluation run of kwchoi/DPO_mistral_7b_ultra_0129_1k\n\n\n\nDataset automatically created during the evaluation run of model kwchoi/DPO_mistral_7b_ultra_0129_1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T02:14:53.024485(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kwchoi/DPO_mistral_7b_ultra_0129_1k\n\n\n\nDataset automatically created during the evaluation run of model kwchoi/DPO_mistral_7b_ultra_0129_1k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T02:14:53.024485(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0edc276e685ecb2321e47ff655fc4995821236ae
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
ss1997/test-data-for-llm
[ "license:llama2", "region:us" ]
2024-01-29T02:38:57+00:00
{"license": "llama2"}
2024-01-29T03:18:54+00:00
[]
[]
TAGS #license-llama2 #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#license-llama2 #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
8b95a6a8aff5154c32c8a96cfdbac70ec5806342
# Dataset Card for "merge_ko" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
realPCH/merge_ko
[ "region:us" ]
2024-01-29T02:47:17+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 680163705, "num_examples": 339282}], "download_size": 281342251, "dataset_size": 680163705}}
2024-01-29T02:47:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "merge_ko" More Information needed
[ "# Dataset Card for \"merge_ko\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"merge_ko\"\n\nMore Information needed" ]
eda09de067f0e9f0020f68e59566f6f56b9c4fde
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v8 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v8](https://huggingface.co/CultriX/Wernicke-7B-v8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_CultriX__Wernicke-7B-v8", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T02:45:02.696586](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v8/blob/main/results_2024-01-29T02-45-02.696586.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6522489089327046, "acc_stderr": 0.03211210883644672, "acc_norm": 0.651592492844927, "acc_norm_stderr": 0.032786280036592515, "mc1": 0.5581395348837209, "mc1_stderr": 0.01738476747898621, "mc2": 0.7130370187226833, "mc2_stderr": 0.014785526706273856 }, "harness|arc:challenge|25": { "acc": 0.7022184300341296, "acc_stderr": 0.013363080107244484, "acc_norm": 0.7244027303754266, "acc_norm_stderr": 0.01305716965576184 }, "harness|hellaswag|10": { "acc": 0.7102170882294364, "acc_stderr": 0.004527343651130796, "acc_norm": 0.8869747062338179, "acc_norm_stderr": 0.003159766252456866 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7094339622641509, "acc_stderr": 0.027943219989337135, "acc_norm": 0.7094339622641509, "acc_norm_stderr": 0.027943219989337135 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.049888765156985884, "acc_norm": 0.44, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5829787234042553, "acc_stderr": 0.03223276266711712, "acc_norm": 0.5829787234042553, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.025506481698138215, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.025506481698138215 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.47619047619047616, "acc_stderr": 0.04467062628403273, "acc_norm": 0.47619047619047616, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.02341529343356852, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.02341529343356852 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.032568666616811015, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.032568666616811015 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586815, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34814814814814815, "acc_stderr": 0.029045600290616255, "acc_norm": 0.34814814814814815, "acc_norm_stderr": 0.029045600290616255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 
0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49074074074074076, "acc_stderr": 0.034093869469927006, "acc_norm": 0.49074074074074076, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406974, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406974 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368985, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368985 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.40782122905027934, "acc_stderr": 0.016435865260914746, "acc_norm": 0.40782122905027934, "acc_norm_stderr": 0.016435865260914746 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.025738854797818733, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.025738854797818733 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46740547588005216, "acc_stderr": 0.012743072942653354, "acc_norm": 0.46740547588005216, "acc_norm_stderr": 0.012743072942653354 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6691176470588235, "acc_stderr": 0.028582709753898452, "acc_norm": 0.6691176470588235, "acc_norm_stderr": 0.028582709753898452 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8208955223880597, "acc_stderr": 0.027113286753111837, "acc_norm": 0.8208955223880597, "acc_norm_stderr": 0.027113286753111837 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.0358870281282637, "acc_norm": 0.85, "acc_norm_stderr": 0.0358870281282637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8187134502923976, "acc_stderr": 0.029547741687640044, "acc_norm": 0.8187134502923976, "acc_norm_stderr": 0.029547741687640044 }, "harness|truthfulqa:mc|0": { "mc1": 0.5581395348837209, "mc1_stderr": 0.01738476747898621, "mc2": 0.7130370187226833, "mc2_stderr": 0.014785526706273856 }, "harness|winogrande|5": { "acc": 0.8484609313338595, "acc_stderr": 0.010077698907571778 }, "harness|gsm8k|5": { "acc": 0.6937073540561031, "acc_stderr": 0.012696930106562903 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
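As a follow-up to the loading snippet in the card above: besides the per-task configurations, the card describes an aggregated "results" configuration, and the "latest" split mirrors the most recent timestamped run. A hedged sketch of pulling those aggregated metrics into pandas (the config and split names are taken from the card text and the standard leaderboard layout, not verified against this specific repo):

```python
# Sketch: load the aggregated "results" configuration described in the card
# and inspect the aggregated metrics for the latest run as a DataFrame.
from datasets import load_dataset

agg = load_dataset(
    "open-llm-leaderboard/details_CultriX__Wernicke-7B-v8",
    "results",
    split="latest",
)
df = agg.to_pandas()
print(df.columns.tolist())   # aggregated metric columns for the run
```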
open-llm-leaderboard/details_CultriX__Wernicke-7B-v8
[ "region:us" ]
2024-01-29T02:47:24+00:00
{"pretty_name": "Evaluation run of CultriX/Wernicke-7B-v8", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/Wernicke-7B-v8](https://huggingface.co/CultriX/Wernicke-7B-v8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__Wernicke-7B-v8\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T02:45:02.696586](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__Wernicke-7B-v8/blob/main/results_2024-01-29T02-45-02.696586.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522489089327046,\n \"acc_stderr\": 0.03211210883644672,\n \"acc_norm\": 0.651592492844927,\n \"acc_norm_stderr\": 0.032786280036592515,\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7130370187226833,\n \"mc2_stderr\": 0.014785526706273856\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7102170882294364,\n \"acc_stderr\": 0.004527343651130796,\n \"acc_norm\": 0.8869747062338179,\n \"acc_norm_stderr\": 0.003159766252456866\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n 
\"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n 
\"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653354,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653354\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898452,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898452\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7130370187226833,\n \"mc2_stderr\": 0.014785526706273856\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \"acc_stderr\": 0.012696930106562903\n }\n}\n```", "repo_url": "https://huggingface.co/CultriX/Wernicke-7B-v8", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|arc:challenge|25_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|gsm8k|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hellaswag|10_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T02-45-02.696586.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["**/details_harness|winogrande|5_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T02-45-02.696586.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T02_45_02.696586", "path": ["results_2024-01-29T02-45-02.696586.parquet"]}, {"split": "latest", "path": 
["results_2024-01-29T02-45-02.696586.parquet"]}]}]}
2024-01-29T02:47:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v8 Dataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v8 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T02:45:02.696586 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one under the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
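The loading step referenced above ("you can for instance do the following") is not shown in this card, so here is a minimal sketch using the `datasets` library. The details repo id below is an assumption based on the Open LLM Leaderboard's usual naming convention; the config name and the `latest` split are taken from the configs listed in this card's metadata.

```python
# Minimal sketch: load one evaluation config from the details dataset.
# The repo id is an assumption (leaderboard naming convention), not confirmed by this card.
from datasets import load_dataset

details = load_dataset(
    "open-llm-leaderboard/details_CultriX__Wernicke-7B-v8",  # assumed repo id
    "harness_winogrande_5",  # any config name listed in this card's metadata
    split="latest",
)
print(details)
```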
[ "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v8\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v8 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T02:45:02.696586(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of CultriX/Wernicke-7B-v8\n\n\n\nDataset automatically created during the evaluation run of model CultriX/Wernicke-7B-v8 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T02:45:02.696586(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
fd60ec98bd460d8ac9ab6e99a20b978b34232ad0
This dataset was created as part of an assignment for my CS-482 course. It was created by following along with the [Hands-on Machine Learning Textbook](https://github.com/ageron/handson-ml3), specifically Jupyter notebook 02.
Raymond-Moody/cs482
[ "size_categories:10K<n<100K", "region:us" ]
2024-01-29T02:49:36+00:00
{"size_categories": ["10K<n<100K"], "dataset_info": {"features": [{"name": "bedrooms__ratio", "dtype": "float64"}, {"name": "rooms_per_house__ratio", "dtype": "float64"}, {"name": "people_per_house__ratio", "dtype": "float64"}, {"name": "total_bedrooms", "dtype": "float64"}, {"name": "total_rooms", "dtype": "float64"}, {"name": "population", "dtype": "float64"}, {"name": "households", "dtype": "float64"}, {"name": "median_income", "dtype": "float64"}, {"name": "Cluster 0 similarity", "dtype": "float64"}, {"name": "Cluster 1 similarity", "dtype": "float64"}, {"name": "Cluster 2 similarity", "dtype": "float64"}, {"name": "Cluster 3 similarity", "dtype": "float64"}, {"name": "Cluster 4 similarity", "dtype": "float64"}, {"name": "Cluster 5 similarity", "dtype": "float64"}, {"name": "Cluster 6 similarity", "dtype": "float64"}, {"name": "Cluster 7 similarity", "dtype": "float64"}, {"name": "Cluster 8 similarity", "dtype": "float64"}, {"name": "Cluster 9 similarity", "dtype": "float64"}, {"name": "ocean_proximity_<1H OCEAN", "dtype": "float64"}, {"name": "ocean_proximity_INLAND", "dtype": "float64"}, {"name": "ocean_proximity_ISLAND", "dtype": "float64"}, {"name": "ocean_proximity_NEAR BAY", "dtype": "float64"}, {"name": "ocean_proximity_NEAR OCEAN", "dtype": "float64"}, {"name": "housing_median_age", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3302400, "num_examples": 16512}], "download_size": 2821107, "dataset_size": 3302400}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-29T04:01:37+00:00
[]
[]
TAGS #size_categories-10K<n<100K #region-us
This dataset was created as part of an assignment for my CS-482 course. It was created by following along with the Hands-on Machine Learning Textbook, specifically Jupyter notebook 02.
[]
[ "TAGS\n#size_categories-10K<n<100K #region-us \n" ]
bdbb91a4c24d1224dc64e1d6358f47ce8f9cb470
# Human Genome Vector Embeddings #### by [Attune Engineering](https://attuneengineering.com) --- This dataset contains a curated vector embedding database of the 43,000 protein coding genes of the human genome.
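The card states what the dataset contains but not how it is meant to be queried; the sketch below is a hypothetical usage example assuming the embeddings are exposed as a Hugging Face dataset with one row per gene and `gene_symbol` / `embedding` columns (both column names and the `train` split are assumptions, not the documented schema).

```python
# Hypothetical sketch: cosine-similarity lookup over per-gene embeddings.
# Column names ("gene_symbol", "embedding") and the split name are assumptions.
import numpy as np
from datasets import load_dataset

ds = load_dataset("attuneengineering/human-genome-vectordb", split="train")

vectors = np.asarray(ds["embedding"], dtype=np.float32)    # shape: (n_genes, dim)
vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)  # unit-normalise for cosine similarity

query = vectors[0]                                         # use the first gene as the query
scores = vectors @ query                                   # cosine similarity to every gene
top5 = np.argsort(-scores)[:5]
print([ds["gene_symbol"][int(i)] for i in top5])
```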
attuneengineering/human-genome-vectordb
[ "language:en", "license:apache-2.0", "biology", "medical", "region:us" ]
2024-01-29T02:56:52+00:00
{"language": ["en"], "license": "apache-2.0", "pretty_name": "Human Genome Vector Embeddings", "tags": ["biology", "medical"]}
2024-01-29T03:05:12+00:00
[]
[ "en" ]
TAGS #language-English #license-apache-2.0 #biology #medical #region-us
# Human Genome Vector Embeddings #### by Attune Engineering --- This dataset contains a curated vector embedding database of the 43,000 protein coding genes of the human genome.
[ "# Human Genome Vector Embeddings", "#### by Attune Engineering\n\n---\n\nThis dataset contains a curated vector embedding database of the 43,000 protein coding genes of the human genome." ]
[ "TAGS\n#language-English #license-apache-2.0 #biology #medical #region-us \n", "# Human Genome Vector Embeddings", "#### by Attune Engineering\n\n---\n\nThis dataset contains a curated vector embedding database of the 43,000 protein coding genes of the human genome." ]
cea43737eb4d4a538d134862ffe4542312707f32
# Dataset Card for "quesst14_all_synth" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Codec-SUPERB/quesst14_all_synth
[ "region:us" ]
2024-01-29T03:46:02+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "original", "path": "data/original-*"}, {"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k_12bps", "path": "data/encodec_24k_12bps-*"}, {"split": "encodec_24k_1_5bps", "path": "data/encodec_24k_1_5bps-*"}, {"split": "encodec_24k_24bps", "path": "data/encodec_24k_24bps-*"}, {"split": "encodec_24k_3bps", "path": "data/encodec_24k_3bps-*"}, {"split": "encodec_24k_6bps", "path": "data/encodec_24k_6bps-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 8000}}}, {"name": "id", "dtype": "string"}], "splits": [{"name": "original", "num_bytes": 1368882918.0, "num_examples": 13607}, {"name": "academicodec_hifi_16k_320d", "num_bytes": 2733824255.0, "num_examples": 13607}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 2733824255.0, "num_examples": 13607}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 4100996735.0, "num_examples": 13607}, {"name": "audiodec_24k_320d", "num_bytes": 4107921615.0, "num_examples": 13607}, {"name": "dac_16k", "num_bytes": 2736769119.0, "num_examples": 13607}, {"name": "dac_24k", "num_bytes": 4104632271.0, "num_examples": 13607}, {"name": "dac_44k", "num_bytes": 7541396965.0, "num_examples": 13607}, {"name": "encodec_24k_12bps", "num_bytes": 4104632271.0, "num_examples": 13607}, {"name": "encodec_24k_1_5bps", "num_bytes": 4104632271.0, "num_examples": 13607}, {"name": "encodec_24k_24bps", "num_bytes": 4104632271.0, "num_examples": 13607}, {"name": "encodec_24k_3bps", "num_bytes": 4104632271.0, "num_examples": 13607}, {"name": "encodec_24k_6bps", "num_bytes": 4104632271.0, "num_examples": 13607}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 2736757881.0, "num_examples": 13607}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 2736757881.0, "num_examples": 13607}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 2737231757.0, "num_examples": 13607}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 2737231757.0, "num_examples": 13607}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 2737231757.0, "num_examples": 13607}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 2737231757.0, "num_examples": 13607}, {"name": "speech_tokenizer_16k", "num_bytes": 2740983853.0, "num_examples": 13607}], "download_size": 18905736290, "dataset_size": 69114836131.0}}
2024-01-29T17:29:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for "quesst14_all_synth" More Information needed
[ "# Dataset Card for \"quesst14_all_synth\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"quesst14_all_synth\"\n\nMore Information needed" ]
2c090cd65792504f0f4fd5edd1879f51562e28ff
This is a subset of [Nectar](https://huggingface.co/datasets/berkeley-nest/Nectar), but in ChatML formatting, with the top-ranking response selected as the output. This subset contains only the entries that have multiple chat turns, have good_natured=True, and do not begin with 'I'm sorry' (kind of a hack, but it seems to work).
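A hedged sketch of how such a subset could be derived from Nectar is shown below. The field names (`answers`, `rank`, `good_natured`, `turns`) reflect Nectar's schema as I understand it and should be verified against the upstream dataset; the 'I'm sorry' check is applied to the top-ranked answer, which is one reading of the description above, and the ChatML rendering is illustrative rather than the exact template used (only the resulting `input`/`output` columns match this card's metadata).

```python
# Hedged sketch of the selection described above: multi-turn, good-natured prompts whose
# top-ranked answer does not start with "I'm sorry", rendered as ChatML input/output pairs.
# Field names ("turns", "good_natured", "answers", "rank") are assumptions about Nectar's schema.
from datasets import load_dataset

nectar = load_dataset("berkeley-nest/Nectar", split="train")

def top_answer(example):
    # Rank 1 is the best-ranked response in Nectar's answer list.
    return min(example["answers"], key=lambda a: a["rank"])["answer"]

def keep(example):
    return (
        example["good_natured"]
        and example["turns"] > 1
        and not top_answer(example).startswith("I'm sorry")
    )

def to_chatml(example):
    # Illustrative ChatML-style rendering; the exact template used for this subset is not shown.
    return {
        "input": f"<|im_start|>user\n{example['prompt']}<|im_end|>\n<|im_start|>assistant\n",
        "output": top_answer(example) + "<|im_end|>",
    }

subset = nectar.filter(keep).map(to_chatml, remove_columns=nectar.column_names)
print(subset[0]["input"][:200])
```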
andysalerno/ansalern-nectar-inputoutput
[ "region:us" ]
2024-01-29T03:47:03+00:00
{"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 78071256, "num_examples": 41539}], "download_size": 37343443, "dataset_size": 78071256}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-30T03:34:53+00:00
[]
[]
TAGS #region-us
This is a subset of Nectar but in ChatML formatting, with the top ranking response selected as the output. This subset is only the entries with multiple chat turns, with good_natured=True, and that do not begin with 'I'm sorry' (kind of a hack but seems to work)
[]
[ "TAGS\n#region-us \n" ]
7e0d4d469cbb2a8ebd0de419e66f7db63c97a75a
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
ombhojane/agortourism
[ "region:us" ]
2024-01-29T04:07:25+00:00
{}
2024-01-29T04:08:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
dbbd3399371a4f17c310f435eb947c13e3f3cc44
# Dataset of Shu/黍 (Arknights) This is the dataset of Shu/黍 (Arknights), containing 43 images and their tags. The core tags of this character are `long_hair, multicolored_hair, horns, blonde_hair, pointy_ears, very_long_hair, blue_hair, earrings, blue_eyes, grey_hair, dragon_horns, hair_between_eyes, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 43 | 96.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shu_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 43 | 44.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shu_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 113 | 97.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shu_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 43 | 80.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shu_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 113 | 155.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shu_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/shu_arknights', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, looking_at_viewer, off_shoulder, smile, solo, upper_body, closed_mouth, collarbone, simple_background, streaked_hair, white_background, gloves, hair_intakes, holding, white_dress, white_hair, long_sleeves, tassel_earrings | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, looking_at_viewer, off_shoulder, solo, closed_mouth, jewelry, smile, white_background, collarbone, long_sleeves, open_jacket, white_dress, simple_background, hair_intakes, hand_up, strapless, white_coat, white_jacket, yellow_gloves | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | looking_at_viewer | off_shoulder | smile | solo | upper_body | closed_mouth | collarbone | simple_background | streaked_hair | white_background | gloves | hair_intakes | holding | white_dress | white_hair | long_sleeves | tassel_earrings | jewelry | open_jacket | hand_up | strapless | white_coat | white_jacket | yellow_gloves | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:---------------|:--------|:-------|:-------------|:---------------|:-------------|:--------------------|:----------------|:-------------------|:---------|:---------------|:----------|:--------------|:-------------|:---------------|:------------------|:----------|:--------------|:----------|:------------|:-------------|:---------------|:----------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | X | X | | X | | X | | X | | X | | X | X | X | X | X | X | X |
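The packages listed above are plain zip archives, so they can also be used without waifuc. Below is a minimal sketch (not part of the original card) that fetches the `dataset-800.zip` package and pairs each image with its tags, assuming the usual IMG+TXT convention of one `.txt` tag file per image with the same basename:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download one of the IMG+TXT packages listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/shu_arknights',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract it to a local directory.
out_dir = 'shu_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# Assumption: each image ships with a same-named .txt file holding its tags.
for name in sorted(os.listdir(out_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(out_dir, name), encoding='utf-8') as f:
            tags = f.read().strip()
        print(name[:-4], '->', tags)
```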
CyberHarem/shu_arknights
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-01-29T04:41:19+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-29T04:51:49+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Shu/黍 (Arknights) ============================ This is the dataset of Shu/黍 (Arknights), containing 43 images and their tags. The core tags of this character are 'long\_hair, multicolored\_hair, horns, blonde\_hair, pointy\_ears, very\_long\_hair, blue\_hair, earrings, blue\_eyes, grey\_hair, dragon\_horns, hair\_between\_eyes, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
cc52fd6a21400363ca11fb91dde1361ad73d79a4
# Dataset Card for "dpo_data_ultra" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pvduy/dpo_data_ultra
[ "region:us" ]
2024-01-29T04:56:16+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 160175363, "num_examples": 38037}, {"name": "test", "num_bytes": 8556760, "num_examples": 1964}, {"name": "train_prefs", "num_bytes": 160175363, "num_examples": 38037}, {"name": "test_prefs", "num_bytes": 8556760, "num_examples": 1964}], "download_size": 189460772, "dataset_size": 337464246}}
2024-01-29T04:56:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "dpo_data_ultra" More Information needed
[ "# Dataset Card for \"dpo_data_ultra\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"dpo_data_ultra\"\n\nMore Information needed" ]
45e1a0a191138efa5ce225ffd71117128e4eb1db
# Dataset Card for "ingush-russian" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lingtrain/ingush-russian
[ "region:us" ]
2024-01-29T06:26:11+00:00
{"dataset_info": {"features": [{"name": "ing", "dtype": "string"}, {"name": "ru", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1552621, "num_examples": 5492}], "download_size": 791728, "dataset_size": 1552621}}
2024-01-29T06:26:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for "ingush-russian" More Information needed
[ "# Dataset Card for \"ingush-russian\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"ingush-russian\"\n\nMore Information needed" ]
296c9a27ea62dcb1ebf72b9253fd84b199101d89
# Dataset Card for "chechen-russian" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lingtrain/chechen-russian
[ "region:us" ]
2024-01-29T06:31:51+00:00
{"dataset_info": {"features": [{"name": "che", "dtype": "string"}, {"name": "ru", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13746785, "num_examples": 31064}], "download_size": 0, "dataset_size": 13746785}}
2024-01-29T06:32:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for "chechen-russian" More Information needed
[ "# Dataset Card for \"chechen-russian\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"chechen-russian\"\n\nMore Information needed" ]
9e5c5c874217cb666dde055a4b67ee31c195610b
# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch1 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [vilm/Mixsmol-4x400M-v0.1-epoch1](https://huggingface.co/vilm/Mixsmol-4x400M-v0.1-epoch1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T06:43:16.109579](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1/blob/main/results_2024-01-29T06-43-16.109579.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.2528141388237338, "acc_stderr": 0.03070376097869733, "acc_norm": 0.2532398880953499, "acc_norm_stderr": 0.03150013405049895, "mc1": 0.20685434516523868, "mc1_stderr": 0.014179591496728343, "mc2": 0.39031505296680813, "mc2_stderr": 0.014793512649281262 }, "harness|arc:challenge|25": { "acc": 0.19965870307167236, "acc_stderr": 0.01168162575688867, "acc_norm": 0.22866894197952217, "acc_norm_stderr": 0.012272853582540802 }, "harness|hellaswag|10": { "acc": 0.285700059749054, "acc_stderr": 0.004508239594503833, "acc_norm": 0.3057159928301135, "acc_norm_stderr": 0.004597684609707824 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.22962962962962963, "acc_stderr": 0.03633384414073465, "acc_norm": 0.22962962962962963, "acc_norm_stderr": 0.03633384414073465 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.025288394502891366, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.025288394502891366 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.24305555555555555, "acc_stderr": 0.03586879280080341, "acc_norm": 0.24305555555555555, "acc_norm_stderr": 0.03586879280080341 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.23, "acc_stderr": 0.04229525846816506, "acc_norm": 0.23, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, 
"acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.2774566473988439, "acc_stderr": 0.03414014007044036, "acc_norm": 0.2774566473988439, "acc_norm_stderr": 0.03414014007044036 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.2549019607843137, "acc_stderr": 0.04336432707993179, "acc_norm": 0.2549019607843137, "acc_norm_stderr": 0.04336432707993179 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.2851063829787234, "acc_stderr": 0.029513196625539355, "acc_norm": 0.2851063829787234, "acc_norm_stderr": 0.029513196625539355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.21929824561403508, "acc_stderr": 0.03892431106518754, "acc_norm": 0.21929824561403508, "acc_norm_stderr": 0.03892431106518754 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.25517241379310346, "acc_stderr": 0.03632984052707841, "acc_norm": 0.25517241379310346, "acc_norm_stderr": 0.03632984052707841 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.25396825396825395, "acc_stderr": 0.022418042891113942, "acc_norm": 0.25396825396825395, "acc_norm_stderr": 0.022418042891113942 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.15873015873015872, "acc_stderr": 0.03268454013011743, "acc_norm": 0.15873015873015872, "acc_norm_stderr": 0.03268454013011743 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.19, "acc_stderr": 0.03942772444036624, "acc_norm": 0.19, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3032258064516129, "acc_stderr": 0.02614868593067175, "acc_norm": 0.3032258064516129, "acc_norm_stderr": 0.02614868593067175 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.29064039408866993, "acc_stderr": 0.0319474007226554, "acc_norm": 0.29064039408866993, "acc_norm_stderr": 0.0319474007226554 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.27, "acc_stderr": 0.044619604333847394, "acc_norm": 0.27, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03225078108306289, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.25252525252525254, "acc_stderr": 0.030954055470365897, "acc_norm": 0.25252525252525254, "acc_norm_stderr": 0.030954055470365897 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.35751295336787564, "acc_stderr": 0.034588160421810045, "acc_norm": 0.35751295336787564, "acc_norm_stderr": 0.034588160421810045 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.2358974358974359, "acc_stderr": 0.021525965407408726, "acc_norm": 0.2358974358974359, "acc_norm_stderr": 0.021525965407408726 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.26296296296296295, "acc_stderr": 0.026842057873833706, "acc_norm": 0.26296296296296295, "acc_norm_stderr": 0.026842057873833706 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3319327731092437, "acc_stderr": 0.030588697013783663, "acc_norm": 0.3319327731092437, "acc_norm_stderr": 0.030588697013783663 }, "harness|hendrycksTest-high_school_physics|5": { 
"acc": 0.23178807947019867, "acc_stderr": 0.03445406271987054, "acc_norm": 0.23178807947019867, "acc_norm_stderr": 0.03445406271987054 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.21651376146788992, "acc_stderr": 0.017658710594443135, "acc_norm": 0.21651376146788992, "acc_norm_stderr": 0.017658710594443135 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4444444444444444, "acc_stderr": 0.03388857118502325, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.03388857118502325 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.24509803921568626, "acc_stderr": 0.03019028245350195, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.22784810126582278, "acc_stderr": 0.027303484599069425, "acc_norm": 0.22784810126582278, "acc_norm_stderr": 0.027303484599069425 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.20179372197309417, "acc_stderr": 0.026936111912802273, "acc_norm": 0.20179372197309417, "acc_norm_stderr": 0.026936111912802273 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.25190839694656486, "acc_stderr": 0.03807387116306086, "acc_norm": 0.25190839694656486, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.36363636363636365, "acc_stderr": 0.04391326286724071, "acc_norm": 0.36363636363636365, "acc_norm_stderr": 0.04391326286724071 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25, "acc_stderr": 0.04186091791394607, "acc_norm": 0.25, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.25153374233128833, "acc_stderr": 0.034089978868575295, "acc_norm": 0.25153374233128833, "acc_norm_stderr": 0.034089978868575295 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.23214285714285715, "acc_stderr": 0.04007341809755806, "acc_norm": 0.23214285714285715, "acc_norm_stderr": 0.04007341809755806 }, "harness|hendrycksTest-management|5": { "acc": 0.1941747572815534, "acc_stderr": 0.03916667762822585, "acc_norm": 0.1941747572815534, "acc_norm_stderr": 0.03916667762822585 }, "harness|hendrycksTest-marketing|5": { "acc": 0.20512820512820512, "acc_stderr": 0.02645350805404035, "acc_norm": 0.20512820512820512, "acc_norm_stderr": 0.02645350805404035 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.2784163473818646, "acc_stderr": 0.016028295188992455, "acc_norm": 0.2784163473818646, "acc_norm_stderr": 0.016028295188992455 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.22254335260115607, "acc_stderr": 0.02239421566194282, "acc_norm": 0.22254335260115607, "acc_norm_stderr": 0.02239421566194282 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961454, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961454 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.19934640522875818, "acc_stderr": 0.022875816993464068, "acc_norm": 0.19934640522875818, "acc_norm_stderr": 0.022875816993464068 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.20257234726688103, "acc_stderr": 0.02282731749105968, "acc_norm": 0.20257234726688103, "acc_norm_stderr": 0.02282731749105968 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.2716049382716049, "acc_stderr": 0.02474862449053737, "acc_norm": 0.2716049382716049, "acc_norm_stderr": 
0.02474862449053737 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24468085106382978, "acc_stderr": 0.02564555362226673, "acc_norm": 0.24468085106382978, "acc_norm_stderr": 0.02564555362226673 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.29411764705882354, "acc_stderr": 0.0276784686421447, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.0276784686421447 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2565359477124183, "acc_stderr": 0.017667841612378984, "acc_norm": 0.2565359477124183, "acc_norm_stderr": 0.017667841612378984 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.20909090909090908, "acc_stderr": 0.038950910157241364, "acc_norm": 0.20909090909090908, "acc_norm_stderr": 0.038950910157241364 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.2612244897959184, "acc_stderr": 0.028123429335142783, "acc_norm": 0.2612244897959184, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.2537313432835821, "acc_stderr": 0.030769444967296018, "acc_norm": 0.2537313432835821, "acc_norm_stderr": 0.030769444967296018 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-virology|5": { "acc": 0.19879518072289157, "acc_stderr": 0.031069390260789424, "acc_norm": 0.19879518072289157, "acc_norm_stderr": 0.031069390260789424 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.22807017543859648, "acc_stderr": 0.03218093795602357, "acc_norm": 0.22807017543859648, "acc_norm_stderr": 0.03218093795602357 }, "harness|truthfulqa:mc|0": { "mc1": 0.20685434516523868, "mc1_stderr": 0.014179591496728343, "mc2": 0.39031505296680813, "mc2_stderr": 0.014793512649281262 }, "harness|winogrande|5": { "acc": 0.5280189423835833, "acc_stderr": 0.014030404213405784 }, "harness|gsm8k|5": { "acc": 0.001516300227445034, "acc_stderr": 0.0010717793485492627 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
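Beyond the per-task configurations, the aggregated results file linked under "Latest results" can also be fetched directly. A minimal sketch, assuming it is a plain JSON file at the root of the dataset repo (the exact key layout may differ, hence the defensive lookup):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON referenced in the card above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1",
    repo_type="dataset",
    filename="results_2024-01-29T06-43-16.109579.json",
)

with open(path) as f:
    results = json.load(f)

# The "all" block may sit at the top level or under a "results" key.
summary = results.get("results", results).get("all", {})
print(json.dumps(summary, indent=2))
```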
open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1
[ "region:us" ]
2024-01-29T06:45:35+00:00
{"pretty_name": "Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vilm/Mixsmol-4x400M-v0.1-epoch1](https://huggingface.co/vilm/Mixsmol-4x400M-v0.1-epoch1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T06:43:16.109579](https://huggingface.co/datasets/open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1/blob/main/results_2024-01-29T06-43-16.109579.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2528141388237338,\n \"acc_stderr\": 0.03070376097869733,\n \"acc_norm\": 0.2532398880953499,\n \"acc_norm_stderr\": 0.03150013405049895,\n \"mc1\": 0.20685434516523868,\n \"mc1_stderr\": 0.014179591496728343,\n \"mc2\": 0.39031505296680813,\n \"mc2_stderr\": 0.014793512649281262\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.01168162575688867,\n \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.285700059749054,\n \"acc_stderr\": 0.004508239594503833,\n \"acc_norm\": 0.3057159928301135,\n \"acc_norm_stderr\": 0.004597684609707824\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073465,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073465\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993179,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993179\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707841,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707841\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.3032258064516129,\n \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3319327731092437,\n \"acc_stderr\": 0.030588697013783663,\n \"acc_norm\": 0.3319327731092437,\n \"acc_norm_stderr\": 0.030588697013783663\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21651376146788992,\n \"acc_stderr\": 0.017658710594443135,\n \"acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.017658710594443135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22784810126582278,\n \"acc_stderr\": 0.027303484599069425,\n \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.027303484599069425\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.02645350805404035,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02645350805404035\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2784163473818646,\n \"acc_stderr\": 0.016028295188992455,\n \"acc_norm\": 0.2784163473818646,\n \"acc_norm_stderr\": 0.016028295188992455\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961454,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961454\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.19934640522875818,\n \"acc_stderr\": 0.022875816993464068,\n \"acc_norm\": 0.19934640522875818,\n \"acc_norm_stderr\": 0.022875816993464068\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n \"acc_stderr\": 0.02282731749105968,\n \"acc_norm\": 0.20257234726688103,\n \"acc_norm_stderr\": 0.02282731749105968\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24468085106382978,\n \"acc_stderr\": 0.02564555362226673,\n \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.02564555362226673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.0276784686421447,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.0276784686421447\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2612244897959184,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.2612244897959184,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.031069390260789424,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.031069390260789424\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20685434516523868,\n \"mc1_stderr\": 0.014179591496728343,\n \"mc2\": 0.39031505296680813,\n \"mc2_stderr\": 0.014793512649281262\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5280189423835833,\n \"acc_stderr\": 0.014030404213405784\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 
0.0010717793485492627\n }\n}\n```", "repo_url": "https://huggingface.co/vilm/Mixsmol-4x400M-v0.1-epoch1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|arc:challenge|25_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|gsm8k|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hellaswag|10_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-43-16.109579.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-43-16.109579.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-43-16.109579.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T06-43-16.109579.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-43-16.109579.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T06_43_16.109579", "path": ["**/details_harness|winogrande|5_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T06-43-16.109579.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_29T06_43_16.109579", "path": ["results_2024-01-29T06-43-16.109579.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T06-43-16.109579.parquet"]}]}]}
2024-01-29T06:45:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch1 Dataset automatically created during the evaluation run of model vilm/Mixsmol-4x400M-v0.1-epoch1 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T06:43:16.109579 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
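The loading step referenced above can be sketched as follows; the repository and config names here are assumptions based on the usual Open LLM Leaderboard details naming scheme for this model and should be checked against the actual repo:

```python
from datasets import load_dataset

# Assumed repository and config names, following the usual
# open-llm-leaderboard details convention for this model.
data = load_dataset(
    "open-llm-leaderboard/details_vilm__Mixsmol-4x400M-v0.1-epoch1",
    "harness_winogrande_5",
    split="train",
)
```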
[ "# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch1\n\n\n\nDataset automatically created during the evaluation run of model vilm/Mixsmol-4x400M-v0.1-epoch1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T06:43:16.109579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of vilm/Mixsmol-4x400M-v0.1-epoch1\n\n\n\nDataset automatically created during the evaluation run of model vilm/Mixsmol-4x400M-v0.1-epoch1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T06:43:16.109579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
df50f0f33dd25cce2d334eb6cf87f4ae68667f6a
German translation of [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs), using Azure ML for translation and [hermeo-7b](https://huggingface.co/malteos/hermeo-7b) to generate the rejected answers.
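A minimal sketch of pulling this dataset for DPO-style training is shown below; the expected column names are an assumption (mirroring Intel/orca_dpo_pairs) and are printed rather than relied on:

```python
from datasets import load_dataset

# Minimal sketch: load the German DPO pairs and inspect the columns.
ds = load_dataset("mayflowergmbh/intel_orca_dpo_pairs_de", split="train")
print(ds.column_names)  # expected (assumption): system / question / chosen / rejected
print(ds[0])
```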
mayflowergmbh/intel_orca_dpo_pairs_de
[ "task_categories:text-generation", "language:de", "license:apache-2.0", "dpo", "region:us" ]
2024-01-29T06:47:20+00:00
{"language": ["de"], "license": "apache-2.0", "task_categories": ["text-generation"], "tags": ["dpo"]}
2024-02-12T09:21:25+00:00
[]
[ "de" ]
TAGS #task_categories-text-generation #language-German #license-apache-2.0 #dpo #region-us
German translation of Intel/orca_dpo_pairs, using Azure ML for translation and hermeo-7b to generate the rejected answers.
[]
[ "TAGS\n#task_categories-text-generation #language-German #license-apache-2.0 #dpo #region-us \n" ]
b50253c325a97a4ae1d6026ed718ce0757f7dced
# Dataset Card for "sanskrit-russian" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lingtrain/sanskrit-russian
[ "region:us" ]
2024-01-29T06:50:56+00:00
{"dataset_info": {"features": [{"name": "ru", "dtype": "string"}, {"name": "san", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13189111, "num_examples": 32669}], "download_size": 7022744, "dataset_size": 13189111}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-06T08:27:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for "sanskrit-russian" More Information needed
[ "# Dataset Card for \"sanskrit-russian\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"sanskrit-russian\"\n\nMore Information needed" ]
cdbb570f8e1940f3a60f55fc031bec8b91696571
Creating a dataset card for "EffiBench: Benchmarking the Efficiency of Code Generated by Large Language Models" on Hugging Face involves providing detailed and structured information about the dataset. Here's a template that you can use and modify according to the specifics of your dataset: ```markdown # Dataset Card for "EffiBench: Benchmarking the Efficiency of Code Generated by Large Language Models" ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contact Information](#contact-information) - [Acknowledgements](#acknowledgements) --- ## Dataset Description ### Dataset Summary The EffiBench dataset, released by EffiBench, includes 1000 efficiency-critical problems designed to benchmark the performance of large language models in generating code. These problems cover a wide range of programming tasks, emphasizing the evaluation of computational efficiency, algorithmic optimization, and resource management in generated code. The dataset is intended for researchers and practitioners focusing on the development and assessment of advanced code generation models, especially in the context of AI-driven software engineering. ### Supported Tasks and Leaderboards EffiBench can be used for the following tasks: - **Code Generation**: Assessing the ability of language models to generate syntactically and semantically correct code. - **Efficiency Evaluation**: Evaluating the computational efficiency of the generated code, including time and memory usage. - **Algorithmic Optimization**: Benchmarking models' capability in optimizing algorithms for performance. Leaderboards could be established based on metrics like code efficiency, correctness, and resource utilization. ### Languages The problems in EffiBench are presented in [Programming Language(s)], suitable for a global audience of AI researchers and developers. ## Dataset Structure ### Data Instances An example data instance from EffiBench might look like this: ```json { "problem_id": "001", "problem_statement": "Write a function to optimize...", "input_format": "...", "output_format": "...", "constraints": "...", "example_input": "...", "example_output": "...", "optimal_solution": "..." } ``` ### Data Fields - `problem_id`: a unique identifier for each problem. - `problem_statement`: a detailed description of the problem. - `input_format`: format of the input data. - `output_format`: expected format of the output. - `constraints`: specific constraints for the problem. - `example_input`: an example input for the problem. - `example_output`: corresponding output for the example input. 
- `optimal_solution`: an optimal solution for the problem, used as a benchmark. ### Data Splits EffiBench is divided into the following splits: - Training: X problems - Validation: Y problems - Test: Z problems ## Dataset Creation ### Curation Rationale [Explanation about why and how the dataset was created.] ### Source Data #### Initial Data Collection and Normalization [Description of the data collection process.] #### Who are the source language producers? [Information about the individuals or organizations that created the original language data.] ### Annotations #### Annotation process [Description of the process used to create the annotations.] #### Who are the annotators? [Information about the individuals or groups who provided annotations, if applicable.] ### Personal and Sensitive Information [Details about any personal or sensitive information contained in the dataset and measures taken to protect privacy.] ## Considerations for Using the Data ### Social Impact of Dataset [Discussion on the potential social impact of the dataset, positive or negative.] ### Discussion of Biases [Discussion on any biases present in the dataset and their potential impact.] ### Other Known Limitations [Discussion on other known limitations of the dataset.] ## Additional Information ### Dataset Curators [Information about the individuals or organizations who curated the dataset.] ### Licensing Information [Information about the licensing of the dataset (e.g., MIT, Apache 2.0).] ### Citation Information ```bibtex @inproceedings{effibench2024, title={EffiBench: Benchmarking the Efficiency of Code Generated by Large Language Models}, author={...}, booktitle={...}, year={2024} } ``` ### Contact Information [Contact details for questions or feedback regarding the dataset.] ### Acknowledgements [Acknowledgements to individuals or organizations who contributed to the dataset.] ``` Remember to fill in the placeholders (like `[Programming Language(s)]`, `X problems`, `Y problems`, `Z problems`, etc.) with the specific details of your dataset. This template covers the essential elements of a Hugging Face dataset card, but you can add more sections if necessary to provide a complete picture of your dataset. [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
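The card names efficiency evaluation (time and memory usage of generated code) as a core task; a purely illustrative, self-contained sketch of how one might time and memory-profile a candidate solution against a reference one is given below. The function names are hypothetical and not part of EffiBench itself.

```python
import time
import tracemalloc

def profile(fn, *args):
    """Return (runtime in seconds, peak memory in bytes) for one call."""
    tracemalloc.start()
    start = time.perf_counter()
    fn(*args)
    runtime = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return runtime, peak

# Hypothetical candidate vs. reference implementations of the same problem.
def candidate_sum(n):
    return sum(range(n))

def reference_sum(n):
    return n * (n - 1) // 2

for name, fn in [("candidate", candidate_sum), ("reference", reference_sum)]:
    t, mem = profile(fn, 1_000_000)
    print(f"{name}: {t:.4f}s, peak {mem} bytes")
```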
DONG19/EffiBench
[ "region:us" ]
2024-01-29T06:56:43+00:00
{"dataset_info": {"features": [{"name": "problem_idx", "dtype": "int64"}, {"name": "task_name", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "markdown_description", "dtype": "string"}, {"name": "canonical_solution", "dtype": "string"}, {"name": "test_case_generator", "dtype": "string"}, {"name": "test_case", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 86588402, "num_examples": 1000}], "download_size": 50403035, "dataset_size": 86588402}}
2024-01-29T07:15:03+00:00
[]
[]
TAGS #region-us
Creating a dataset card for "EffiBench: Benchmarking the Efficiency of Code Generated by Large Language Models" on Hugging Face involves providing detailed and structured information about the dataset. Here's a template that you can use and modify according to the specifics of your dataset: json { "problem_id": "001", "problem_statement": "Write a function to optimize...", "input_format": "...", "output_format": "...", "constraints": "...", "example_input": "...", "example_output": "...", "optimal_solution": "..." } bibtex @inproceedings{effibench2024, title={EffiBench: Benchmarking the Efficiency of Code Generated by Large Language Models}, author={...}, booktitle={...}, year={2024} } Remember to fill in the placeholders (like '[Programming Language(s)]', 'X problems', 'Y problems', 'Z problems', etc.) with the specific details of your dataset. This template covers the essential elements of a Hugging Face dataset card, but you can add more sections if necessary to provide a complete picture of your dataset. More Information needed
[]
[ "TAGS\n#region-us \n" ]
5fd7cd2ebc7ff2d8bc2b32c5c4ff51268062300b
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b](https://huggingface.co/ibivibiv/multimaster-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T06:57:35.053751](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b/blob/main/results_2024-01-29T06-57-35.053751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4672294483827764, "acc_stderr": 0.03460052472829534, "acc_norm": 0.4730946802206324, "acc_norm_stderr": 0.03539528166637499, "mc1": 0.30599755201958384, "mc1_stderr": 0.016132229728155038, "mc2": 0.4497774423277904, "mc2_stderr": 0.015449037055535802 }, "harness|arc:challenge|25": { "acc": 0.3728668941979522, "acc_stderr": 0.014131176760131169, "acc_norm": 0.4104095563139932, "acc_norm_stderr": 0.014374922192642666 }, "harness|hellaswag|10": { "acc": 0.5748854809798845, "acc_stderr": 0.004933500261683597, "acc_norm": 0.7499502091216889, "acc_norm_stderr": 0.004321564303822423 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.42962962962962964, "acc_stderr": 0.04276349494376599, "acc_norm": 0.42962962962962964, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.45394736842105265, "acc_stderr": 0.04051646342874143, "acc_norm": 0.45394736842105265, "acc_norm_stderr": 0.04051646342874143 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5320754716981132, "acc_stderr": 0.03070948699255655, "acc_norm": 0.5320754716981132, "acc_norm_stderr": 0.03070948699255655 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4305555555555556, "acc_stderr": 0.04140685639111503, "acc_norm": 0.4305555555555556, "acc_norm_stderr": 0.04140685639111503 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.049236596391733084, "acc_norm": 0.4, "acc_norm_stderr": 0.049236596391733084 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621505, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.42196531791907516, "acc_stderr": 0.0376574669386515, "acc_norm": 0.42196531791907516, "acc_norm_stderr": 0.0376574669386515 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.22549019607843138, "acc_stderr": 0.041583075330832865, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.041583075330832865 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4085106382978723, "acc_stderr": 0.03213418026701576, "acc_norm": 0.4085106382978723, "acc_norm_stderr": 0.03213418026701576 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.30701754385964913, "acc_stderr": 0.04339138322579861, "acc_norm": 0.30701754385964913, "acc_norm_stderr": 0.04339138322579861 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.43448275862068964, "acc_stderr": 0.04130740879555497, "acc_norm": 0.43448275862068964, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3412698412698413, "acc_stderr": 0.02441923496681907, "acc_norm": 0.3412698412698413, "acc_norm_stderr": 0.02441923496681907 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.30158730158730157, "acc_stderr": 0.04104947269903394, "acc_norm": 0.30158730158730157, "acc_norm_stderr": 0.04104947269903394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5096774193548387, "acc_stderr": 0.02843867799890955, "acc_norm": 0.5096774193548387, "acc_norm_stderr": 0.02843867799890955 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.35960591133004927, "acc_stderr": 0.033764582465095665, "acc_norm": 0.35960591133004927, "acc_norm_stderr": 0.033764582465095665 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5575757575757576, "acc_stderr": 0.03878372113711274, "acc_norm": 0.5575757575757576, "acc_norm_stderr": 0.03878372113711274 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6414141414141414, "acc_stderr": 0.034169036403915214, "acc_norm": 0.6414141414141414, "acc_norm_stderr": 0.034169036403915214 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.5233160621761658, "acc_stderr": 0.03604513672442202, "acc_norm": 0.5233160621761658, "acc_norm_stderr": 0.03604513672442202 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.43333333333333335, "acc_stderr": 0.025124653525885127, "acc_norm": 0.43333333333333335, "acc_norm_stderr": 0.025124653525885127 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.02784081149587194, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.02784081149587194 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4369747899159664, "acc_stderr": 0.03221943636566196, "acc_norm": 0.4369747899159664, "acc_norm_stderr": 0.03221943636566196 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 
0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6311926605504588, "acc_stderr": 0.020686227560729548, "acc_norm": 0.6311926605504588, "acc_norm_stderr": 0.020686227560729548 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4351851851851852, "acc_stderr": 0.033812000056435254, "acc_norm": 0.4351851851851852, "acc_norm_stderr": 0.033812000056435254 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5392156862745098, "acc_stderr": 0.03498501649369527, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.03498501649369527 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6286919831223629, "acc_stderr": 0.03145068600744859, "acc_norm": 0.6286919831223629, "acc_norm_stderr": 0.03145068600744859 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.48878923766816146, "acc_stderr": 0.033549366530984746, "acc_norm": 0.48878923766816146, "acc_norm_stderr": 0.033549366530984746 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5419847328244275, "acc_stderr": 0.04369802690578756, "acc_norm": 0.5419847328244275, "acc_norm_stderr": 0.04369802690578756 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6198347107438017, "acc_stderr": 0.04431324501968431, "acc_norm": 0.6198347107438017, "acc_norm_stderr": 0.04431324501968431 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5925925925925926, "acc_stderr": 0.047500773411999854, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.047500773411999854 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5460122699386503, "acc_stderr": 0.0391170190467718, "acc_norm": 0.5460122699386503, "acc_norm_stderr": 0.0391170190467718 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.32142857142857145, "acc_stderr": 0.044328040552915185, "acc_norm": 0.32142857142857145, "acc_norm_stderr": 0.044328040552915185 }, "harness|hendrycksTest-management|5": { "acc": 0.6019417475728155, "acc_stderr": 0.04846748253977238, "acc_norm": 0.6019417475728155, "acc_norm_stderr": 0.04846748253977238 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7393162393162394, "acc_stderr": 0.02876034895652341, "acc_norm": 0.7393162393162394, "acc_norm_stderr": 0.02876034895652341 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6424010217113666, "acc_stderr": 0.017139488998803288, "acc_norm": 0.6424010217113666, "acc_norm_stderr": 0.017139488998803288 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.45375722543352603, "acc_stderr": 0.026803720583206184, "acc_norm": 0.45375722543352603, "acc_norm_stderr": 0.026803720583206184 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24134078212290502, "acc_stderr": 0.014310999547961452, "acc_norm": 0.24134078212290502, "acc_norm_stderr": 0.014310999547961452 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.49673202614379086, "acc_stderr": 0.028629305194003543, "acc_norm": 0.49673202614379086, "acc_norm_stderr": 0.028629305194003543 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5594855305466238, "acc_stderr": 0.028196400574197422, "acc_norm": 0.5594855305466238, "acc_norm_stderr": 0.028196400574197422 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5555555555555556, "acc_stderr": 0.02764847787741332, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.02764847787741332 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.3475177304964539, "acc_stderr": 0.028406627809590947, "acc_norm": 0.3475177304964539, "acc_norm_stderr": 0.028406627809590947 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.35919165580182527, "acc_stderr": 0.012253386187584253, "acc_norm": 0.35919165580182527, "acc_norm_stderr": 0.012253386187584253 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.41911764705882354, "acc_stderr": 0.029972807170464622, "acc_norm": 0.41911764705882354, "acc_norm_stderr": 0.029972807170464622 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.44281045751633985, "acc_stderr": 0.020095083154577358, "acc_norm": 0.44281045751633985, "acc_norm_stderr": 0.020095083154577358 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5, "acc_stderr": 0.04789131426105757, "acc_norm": 0.5, "acc_norm_stderr": 0.04789131426105757 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5020408163265306, "acc_stderr": 0.0320089533497105, "acc_norm": 0.5020408163265306, "acc_norm_stderr": 0.0320089533497105 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6169154228855721, "acc_stderr": 0.034375193373382504, "acc_norm": 0.6169154228855721, "acc_norm_stderr": 0.034375193373382504 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6491228070175439, "acc_stderr": 0.03660298834049162, "acc_norm": 0.6491228070175439, "acc_norm_stderr": 0.03660298834049162 }, "harness|truthfulqa:mc|0": { "mc1": 0.30599755201958384, "mc1_stderr": 0.016132229728155038, "mc2": 0.4497774423277904, "mc2_stderr": 0.015449037055535802 }, "harness|winogrande|5": { "acc": 0.6835043409629045, "acc_stderr": 0.01307186832805148 }, "harness|gsm8k|5": { "acc": 0.11751326762699014, "acc_stderr": 0.008870331256489969 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_ibivibiv__multimaster-7b
[ "region:us" ]
2024-01-29T06:59:56+00:00
{"pretty_name": "Evaluation run of ibivibiv/multimaster-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b](https://huggingface.co/ibivibiv/multimaster-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__multimaster-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T06:57:35.053751](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b/blob/main/results_2024-01-29T06-57-35.053751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4672294483827764,\n \"acc_stderr\": 0.03460052472829534,\n \"acc_norm\": 0.4730946802206324,\n \"acc_norm_stderr\": 0.03539528166637499,\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155038,\n \"mc2\": 0.4497774423277904,\n \"mc2_stderr\": 0.015449037055535802\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131169,\n \"acc_norm\": 0.4104095563139932,\n \"acc_norm_stderr\": 0.014374922192642666\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5748854809798845,\n \"acc_stderr\": 0.004933500261683597,\n \"acc_norm\": 0.7499502091216889,\n \"acc_norm_stderr\": 0.004321564303822423\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n 
\"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.034169036403915214,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.034169036403915214\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.43333333333333335,\n 
\"acc_stderr\": 0.025124653525885127,\n \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.025124653525885127\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587194,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587194\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729548,\n \"acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729548\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.03498501649369527,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.03498501649369527\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977238,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977238\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n \"acc_stderr\": 0.02876034895652341,\n \"acc_norm\": 0.7393162393162394,\n \"acc_norm_stderr\": 0.02876034895652341\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6424010217113666,\n \"acc_stderr\": 0.017139488998803288,\n \"acc_norm\": 0.6424010217113666,\n 
\"acc_norm_stderr\": 0.017139488998803288\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.45375722543352603,\n \"acc_stderr\": 0.026803720583206184,\n \"acc_norm\": 0.45375722543352603,\n \"acc_norm_stderr\": 0.026803720583206184\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961452,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961452\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n \"acc_stderr\": 0.028196400574197422,\n \"acc_norm\": 0.5594855305466238,\n \"acc_norm_stderr\": 0.028196400574197422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02764847787741332,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02764847787741332\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590947,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590947\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35919165580182527,\n \"acc_stderr\": 0.012253386187584253,\n \"acc_norm\": 0.35919165580182527,\n \"acc_norm_stderr\": 0.012253386187584253\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.44281045751633985,\n \"acc_stderr\": 0.020095083154577358,\n \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.020095083154577358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n \"acc_stderr\": 0.034375193373382504,\n \"acc_norm\": 0.6169154228855721,\n \"acc_norm_stderr\": 0.034375193373382504\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049162,\n \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049162\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155038,\n \"mc2\": 0.4497774423277904,\n \"mc2_stderr\": 0.015449037055535802\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6835043409629045,\n \"acc_stderr\": 0.01307186832805148\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \"acc_stderr\": 0.008870331256489969\n }\n}\n```", "repo_url": "https://huggingface.co/ibivibiv/multimaster-7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|arc:challenge|25_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|gsm8k|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hellaswag|10_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-57-35.053751.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-57-35.053751.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-57-35.053751.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T06-57-35.053751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-57-35.053751.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T06-57-35.053751.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["**/details_harness|winogrande|5_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T06-57-35.053751.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T06_57_35.053751", "path": ["results_2024-01-29T06-57-35.053751.parquet"]}, {"split": "latest", "path": 
["results_2024-01-29T06-57-35.053751.parquet"]}]}]}
2024-01-29T07:00:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b Dataset automatically created during the evaluation run of model ibivibiv/multimaster-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T06:57:35.053751 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of ibivibiv/multimaster-7b\n\n\n\nDataset automatically created during the evaluation run of model ibivibiv/multimaster-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T06:57:35.053751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of ibivibiv/multimaster-7b\n\n\n\nDataset automatically created during the evaluation run of model ibivibiv/multimaster-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T06:57:35.053751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b136f31e714339d4dac4643a1db3c0590c7ff904
# Dataset Card for Evaluation run of dddsaty/SOLAR-Instruct-ko-Adapter-Attach <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [dddsaty/SOLAR-Instruct-ko-Adapter-Attach](https://huggingface.co/dddsaty/SOLAR-Instruct-ko-Adapter-Attach) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_dddsaty__SOLAR-Instruct-ko-Adapter-Attach", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T07:15:19.407850](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__SOLAR-Instruct-ko-Adapter-Attach/blob/main/results_2024-01-29T07-15-19.407850.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6645870295864068, "acc_stderr": 0.03171088413744625, "acc_norm": 0.665494569281768, "acc_norm_stderr": 0.03235701825353022, "mc1": 0.5667074663402693, "mc1_stderr": 0.017347024450107485, "mc2": 0.7150654372668831, "mc2_stderr": 0.0150179447040812 }, "harness|arc:challenge|25": { "acc": 0.6825938566552902, "acc_stderr": 0.013602239088038167, "acc_norm": 0.7107508532423208, "acc_norm_stderr": 0.013250012579393441 }, "harness|hellaswag|10": { "acc": 0.7078271260705039, "acc_stderr": 0.004538319464111956, "acc_norm": 0.8819956184027086, "acc_norm_stderr": 0.003219539790500477 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.42, "acc_stderr": 0.04960449637488583, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488583 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7368421052631579, "acc_stderr": 0.03583496176361072, "acc_norm": 0.7368421052631579, "acc_norm_stderr": 0.03583496176361072 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.0440844002276808, "acc_norm": 0.74, "acc_norm_stderr": 0.0440844002276808 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6792452830188679, "acc_stderr": 0.02872750295788027, "acc_norm": 0.6792452830188679, "acc_norm_stderr": 0.02872750295788027 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, 
"acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6297872340425532, "acc_stderr": 0.03156564682236785, "acc_norm": 0.6297872340425532, "acc_norm_stderr": 0.03156564682236785 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5, "acc_stderr": 0.047036043419179864, "acc_norm": 0.5, "acc_norm_stderr": 0.047036043419179864 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6275862068965518, "acc_stderr": 0.04028731532947558, "acc_norm": 0.6275862068965518, "acc_norm_stderr": 0.04028731532947558 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47883597883597884, "acc_stderr": 0.025728230952130726, "acc_norm": 0.47883597883597884, "acc_norm_stderr": 0.025728230952130726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8032258064516129, "acc_stderr": 0.022616409420742025, "acc_norm": 0.8032258064516129, "acc_norm_stderr": 0.022616409420742025 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8737373737373737, "acc_stderr": 0.02366435940288023, "acc_norm": 0.8737373737373737, "acc_norm_stderr": 0.02366435940288023 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3814814814814815, "acc_stderr": 0.029616718927497593, "acc_norm": 0.3814814814814815, "acc_norm_stderr": 0.029616718927497593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634332, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634332 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242741, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242741 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5601851851851852, "acc_stderr": 0.0338517797604481, "acc_norm": 0.5601851851851852, "acc_norm_stderr": 0.0338517797604481 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.025524722324553353, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.025524722324553353 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8607594936708861, "acc_stderr": 0.022535526352692705, "acc_norm": 0.8607594936708861, "acc_norm_stderr": 0.022535526352692705 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.03760178006026621, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.03760178006026621 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.0230866350868414, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.0230866350868414 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8007662835249042, "acc_stderr": 0.01428337804429642, "acc_norm": 0.8007662835249042, "acc_norm_stderr": 0.01428337804429642 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7601156069364162, "acc_stderr": 0.022989592543123567, "acc_norm": 0.7601156069364162, "acc_norm_stderr": 0.022989592543123567 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38994413407821227, "acc_stderr": 0.01631237662921307, "acc_norm": 0.38994413407821227, "acc_norm_stderr": 0.01631237662921307 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7581699346405228, "acc_stderr": 0.024518195641879334, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.024518195641879334 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7331189710610932, "acc_stderr": 0.025122637608816646, "acc_norm": 0.7331189710610932, "acc_norm_stderr": 0.025122637608816646 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7870370370370371, "acc_stderr": 0.022779719088733396, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.022779719088733396 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.49282920469361147, "acc_stderr": 0.012768922739553308, "acc_norm": 0.49282920469361147, "acc_norm_stderr": 0.012768922739553308 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7426470588235294, "acc_stderr": 0.0265565194700415, "acc_norm": 0.7426470588235294, "acc_norm_stderr": 0.0265565194700415 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6911764705882353, "acc_stderr": 0.018690850273595294, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.018690850273595294 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.0282638899437846, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.0282638899437846 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598052, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598052 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03188578017686398, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03188578017686398 }, "harness|truthfulqa:mc|0": { "mc1": 0.5667074663402693, "mc1_stderr": 0.017347024450107485, "mc2": 0.7150654372668831, "mc2_stderr": 0.0150179447040812 }, "harness|winogrande|5": { "acc": 0.835043409629045, "acc_stderr": 0.010430917468237431 }, "harness|gsm8k|5": { "acc": 0.6429112964366944, "acc_stderr": 0.013197931775445208 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_dddsaty__SOLAR-Instruct-ko-Adapter-Attach
[ "region:us" ]
2024-01-29T07:11:37+00:00
{"pretty_name": "Evaluation run of dddsaty/SOLAR-Instruct-ko-Adapter-Attach", "dataset_summary": "Dataset automatically created during the evaluation run of model [dddsaty/SOLAR-Instruct-ko-Adapter-Attach](https://huggingface.co/dddsaty/SOLAR-Instruct-ko-Adapter-Attach) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dddsaty__SOLAR-Instruct-ko-Adapter-Attach\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T07:15:19.407850](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__SOLAR-Instruct-ko-Adapter-Attach/blob/main/results_2024-01-29T07-15-19.407850.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6645870295864068,\n \"acc_stderr\": 0.03171088413744625,\n \"acc_norm\": 0.665494569281768,\n \"acc_norm_stderr\": 0.03235701825353022,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7150654372668831,\n \"mc2_stderr\": 0.0150179447040812\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7078271260705039,\n \"acc_stderr\": 0.004538319464111956,\n \"acc_norm\": 0.8819956184027086,\n \"acc_norm_stderr\": 0.003219539790500477\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361072,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361072\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497593,\n \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n 
\"acc_stderr\": 0.01428337804429642,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.01428337804429642\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 0.38994413407821227,\n \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816646,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816646\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.022779719088733396,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.022779719088733396\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595294,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595294\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7150654372668831,\n \"mc2_stderr\": 0.0150179447040812\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237431\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6429112964366944,\n \"acc_stderr\": 0.013197931775445208\n }\n}\n```", "repo_url": 
"https://huggingface.co/dddsaty/SOLAR-Instruct-ko-Adapter-Attach", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-09-21.307966.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-09-21.307966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-15-19.407850.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-15-19.407850.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-15-19.407850.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-15-19.407850.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-09-21.307966.parquet"]}, 
{"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["**/details_harness|winogrande|5_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": ["**/details_harness|winogrande|5_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T07-15-19.407850.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T07_09_21.307966", "path": ["results_2024-01-29T07-09-21.307966.parquet"]}, {"split": "2024_01_29T07_15_19.407850", "path": 
["results_2024-01-29T07-15-19.407850.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T07-15-19.407850.parquet"]}]}]}
2024-01-29T07:17:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of dddsaty/SOLAR-Instruct-ko-Adapter-Attach

Dataset automatically created during the evaluation run of model dddsaty/SOLAR-Instruct-ko-Adapter-Attach on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the loading sketch after this card template):

## Latest results

These are the latest results from run 2024-01-29T07:15:19.407850 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
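The loading snippet referenced in the card above (taken from the dataset summary) is reproduced here as a minimal sketch. It assumes the Hugging Face `datasets` library is installed; the configuration names (`harness_winogrande_5`, `results`) and the `latest` split come from the configs listed in this card's metadata.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_dddsaty__SOLAR-Instruct-ko-Adapter-Attach"

# Per-task details for one configuration (here: 5-shot Winogrande);
# the "latest" split always points to the most recent evaluation run.
winogrande_details = load_dataset(REPO, "harness_winogrande_5", split="latest")

# Aggregated metrics for the whole run are stored in the "results" configuration.
aggregated_results = load_dataset(REPO, "results", split="latest")

print(winogrande_details)
print(aggregated_results)
```

Any of the timestamped split names listed in the metadata (e.g. `2024_01_29T07_15_19.407850`) can be used in place of `latest` to pin a specific run.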
[ "# Dataset Card for Evaluation run of dddsaty/SOLAR-Instruct-ko-Adapter-Attach\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/SOLAR-Instruct-ko-Adapter-Attach on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T07:15:19.407850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of dddsaty/SOLAR-Instruct-ko-Adapter-Attach\n\n\n\nDataset automatically created during the evaluation run of model dddsaty/SOLAR-Instruct-ko-Adapter-Attach on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T07:15:19.407850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7ccb6afafafea6fa08bff052dc8faa7c1a85183e
# Dataset Card for Evaluation run of cloudyu/Phoenix_DPO_60B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [cloudyu/Phoenix_DPO_60B](https://huggingface.co/cloudyu/Phoenix_DPO_60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_cloudyu__Phoenix_DPO_60B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T07:18:26.493020](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Phoenix_DPO_60B/blob/main/results_2024-01-29T07-18-26.493020.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.773060355126592, "acc_stderr": 0.02788677122058737, "acc_norm": 0.7768613327892654, "acc_norm_stderr": 0.028421288853019304, "mc1": 0.47613219094247244, "mc1_stderr": 0.017483547156961574, "mc2": 0.6383841795252835, "mc2_stderr": 0.014733348951101679 }, "harness|arc:challenge|25": { "acc": 0.6877133105802048, "acc_stderr": 0.013542598541688067, "acc_norm": 0.71160409556314, "acc_norm_stderr": 0.013238394422428173 }, "harness|hellaswag|10": { "acc": 0.6528579964150567, "acc_stderr": 0.004750884401095161, "acc_norm": 0.8546106353316073, "acc_norm_stderr": 0.003517725787017736 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.725925925925926, "acc_stderr": 0.03853254836552003, "acc_norm": 0.725925925925926, "acc_norm_stderr": 0.03853254836552003 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.875, "acc_stderr": 0.026913523521537846, "acc_norm": 0.875, "acc_norm_stderr": 0.026913523521537846 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8037735849056604, "acc_stderr": 0.024442388131100817, "acc_norm": 0.8037735849056604, "acc_norm_stderr": 0.024442388131100817 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.9166666666666666, "acc_stderr": 0.023112508176051236, "acc_norm": 0.9166666666666666, "acc_norm_stderr": 0.023112508176051236 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.65, "acc_stderr": 0.04793724854411019, "acc_norm": 0.65, "acc_norm_stderr": 0.04793724854411019 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956914, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956914 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7456647398843931, "acc_stderr": 0.0332055644308557, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.0332055644308557 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5196078431372549, "acc_stderr": 0.04971358884367406, "acc_norm": 0.5196078431372549, "acc_norm_stderr": 0.04971358884367406 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.83, "acc_stderr": 0.03775251680686371, "acc_norm": 0.83, "acc_norm_stderr": 0.03775251680686371 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.8, "acc_stderr": 0.026148818018424502, "acc_norm": 0.8, "acc_norm_stderr": 0.026148818018424502 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6228070175438597, "acc_stderr": 0.04559522141958216, "acc_norm": 0.6228070175438597, "acc_norm_stderr": 0.04559522141958216 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.8, "acc_stderr": 0.0333333333333333, "acc_norm": 0.8, "acc_norm_stderr": 0.0333333333333333 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.7328042328042328, "acc_stderr": 0.02278967314577657, "acc_norm": 0.7328042328042328, "acc_norm_stderr": 0.02278967314577657 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5873015873015873, "acc_stderr": 0.04403438954768176, "acc_norm": 0.5873015873015873, "acc_norm_stderr": 0.04403438954768176 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.9032258064516129, "acc_stderr": 0.016818943416345197, "acc_norm": 0.9032258064516129, "acc_norm_stderr": 0.016818943416345197 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6502463054187192, "acc_stderr": 0.03355400904969566, "acc_norm": 0.6502463054187192, "acc_norm_stderr": 0.03355400904969566 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066584, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066584 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.9393939393939394, "acc_stderr": 0.016999994927421606, "acc_norm": 0.9393939393939394, "acc_norm_stderr": 0.016999994927421606 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9689119170984456, "acc_stderr": 0.012525310625527033, "acc_norm": 0.9689119170984456, "acc_norm_stderr": 0.012525310625527033 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.8282051282051283, "acc_stderr": 0.01912490360342356, "acc_norm": 0.8282051282051283, "acc_norm_stderr": 0.01912490360342356 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.45555555555555555, "acc_stderr": 0.03036486250482443, "acc_norm": 0.45555555555555555, "acc_norm_stderr": 0.03036486250482443 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8571428571428571, "acc_stderr": 0.022730208119306535, "acc_norm": 0.8571428571428571, "acc_norm_stderr": 0.022730208119306535 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5099337748344371, "acc_stderr": 0.04081677107248437, "acc_norm": 0.5099337748344371, "acc_norm_stderr": 
0.04081677107248437 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9284403669724771, "acc_stderr": 0.01105125524781546, "acc_norm": 0.9284403669724771, "acc_norm_stderr": 0.01105125524781546 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6574074074074074, "acc_stderr": 0.032365852526021574, "acc_norm": 0.6574074074074074, "acc_norm_stderr": 0.032365852526021574 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9264705882352942, "acc_stderr": 0.018318855850089674, "acc_norm": 0.9264705882352942, "acc_norm_stderr": 0.018318855850089674 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9156118143459916, "acc_stderr": 0.01809424711647332, "acc_norm": 0.9156118143459916, "acc_norm_stderr": 0.01809424711647332 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.8071748878923767, "acc_stderr": 0.026478240960489365, "acc_norm": 0.8071748878923767, "acc_norm_stderr": 0.026478240960489365 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8778625954198473, "acc_stderr": 0.028718776889342344, "acc_norm": 0.8778625954198473, "acc_norm_stderr": 0.028718776889342344 }, "harness|hendrycksTest-international_law|5": { "acc": 0.9090909090909091, "acc_stderr": 0.02624319405407388, "acc_norm": 0.9090909090909091, "acc_norm_stderr": 0.02624319405407388 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.9074074074074074, "acc_stderr": 0.02802188803860943, "acc_norm": 0.9074074074074074, "acc_norm_stderr": 0.02802188803860943 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8773006134969326, "acc_stderr": 0.025777328426978927, "acc_norm": 0.8773006134969326, "acc_norm_stderr": 0.025777328426978927 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6071428571428571, "acc_stderr": 0.046355501356099754, "acc_norm": 0.6071428571428571, "acc_norm_stderr": 0.046355501356099754 }, "harness|hendrycksTest-management|5": { "acc": 0.8932038834951457, "acc_stderr": 0.030581088928331356, "acc_norm": 0.8932038834951457, "acc_norm_stderr": 0.030581088928331356 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9401709401709402, "acc_stderr": 0.015537514263253855, "acc_norm": 0.9401709401709402, "acc_norm_stderr": 0.015537514263253855 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.9054916985951469, "acc_stderr": 0.01046101533819307, "acc_norm": 0.9054916985951469, "acc_norm_stderr": 0.01046101533819307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.8236994219653179, "acc_stderr": 0.020516425672490714, "acc_norm": 0.8236994219653179, "acc_norm_stderr": 0.020516425672490714 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.794413407821229, "acc_stderr": 0.013516116210724202, "acc_norm": 0.794413407821229, "acc_norm_stderr": 0.013516116210724202 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.8562091503267973, "acc_stderr": 0.020091188936043693, "acc_norm": 0.8562091503267973, "acc_norm_stderr": 0.020091188936043693 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8231511254019293, "acc_stderr": 0.021670058885510796, "acc_norm": 0.8231511254019293, "acc_norm_stderr": 0.021670058885510796 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8703703703703703, "acc_stderr": 0.018689725721062072, "acc_norm": 0.8703703703703703, "acc_norm_stderr": 0.018689725721062072 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.6702127659574468, "acc_stderr": 
0.028045946942042405, "acc_norm": 0.6702127659574468, "acc_norm_stderr": 0.028045946942042405 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.6075619295958279, "acc_stderr": 0.012471243669229104, "acc_norm": 0.6075619295958279, "acc_norm_stderr": 0.012471243669229104 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.8345588235294118, "acc_stderr": 0.02257177102549476, "acc_norm": 0.8345588235294118, "acc_norm_stderr": 0.02257177102549476 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.815359477124183, "acc_stderr": 0.015697029240757776, "acc_norm": 0.815359477124183, "acc_norm_stderr": 0.015697029240757776 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.8367346938775511, "acc_stderr": 0.02366169917709861, "acc_norm": 0.8367346938775511, "acc_norm_stderr": 0.02366169917709861 }, "harness|hendrycksTest-sociology|5": { "acc": 0.900497512437811, "acc_stderr": 0.02116621630465939, "acc_norm": 0.900497512437811, "acc_norm_stderr": 0.02116621630465939 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776334, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776334 }, "harness|hendrycksTest-virology|5": { "acc": 0.5843373493975904, "acc_stderr": 0.03836722176598053, "acc_norm": 0.5843373493975904, "acc_norm_stderr": 0.03836722176598053 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8947368421052632, "acc_stderr": 0.023537557657892547, "acc_norm": 0.8947368421052632, "acc_norm_stderr": 0.023537557657892547 }, "harness|truthfulqa:mc|0": { "mc1": 0.47613219094247244, "mc1_stderr": 0.017483547156961574, "mc2": 0.6383841795252835, "mc2_stderr": 0.014733348951101679 }, "harness|winogrande|5": { "acc": 0.8492501973164956, "acc_stderr": 0.010056094631479698 }, "harness|gsm8k|5": { "acc": 0.6982562547384382, "acc_stderr": 0.012643544762873353 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_cloudyu__Phoenix_DPO_60B
[ "region:us" ]
2024-01-29T07:20:39+00:00
{"pretty_name": "Evaluation run of cloudyu/Phoenix_DPO_60B", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Phoenix_DPO_60B](https://huggingface.co/cloudyu/Phoenix_DPO_60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Phoenix_DPO_60B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T07:18:26.493020](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Phoenix_DPO_60B/blob/main/results_2024-01-29T07-18-26.493020.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.773060355126592,\n \"acc_stderr\": 0.02788677122058737,\n \"acc_norm\": 0.7768613327892654,\n \"acc_norm_stderr\": 0.028421288853019304,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6383841795252835,\n \"mc2_stderr\": 0.014733348951101679\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688067,\n \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n \"acc_stderr\": 0.004750884401095161,\n \"acc_norm\": 0.8546106353316073,\n \"acc_norm_stderr\": 0.003517725787017736\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100817,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100817\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 
0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956914,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956914\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.026148818018424502,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.026148818018424502\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.0333333333333333,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.0333333333333333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7328042328042328,\n \"acc_stderr\": 0.02278967314577657,\n \"acc_norm\": 0.7328042328042328,\n \"acc_norm_stderr\": 0.02278967314577657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421606,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421606\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8282051282051283,\n \"acc_stderr\": 0.01912490360342356,\n \"acc_norm\": 
0.8282051282051283,\n \"acc_norm_stderr\": 0.01912490360342356\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.022730208119306535,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.022730208119306535\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9284403669724771,\n \"acc_stderr\": 0.01105125524781546,\n \"acc_norm\": 0.9284403669724771,\n \"acc_norm_stderr\": 0.01105125524781546\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089674,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9156118143459916,\n \"acc_stderr\": 0.01809424711647332,\n \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.01809424711647332\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9074074074074074,\n \"acc_stderr\": 0.02802188803860943,\n \"acc_norm\": 0.9074074074074074,\n \"acc_norm_stderr\": 0.02802188803860943\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253855,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253855\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n \"acc_stderr\": 0.01046101533819307,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.01046101533819307\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490714,\n \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490714\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.794413407821229,\n \"acc_stderr\": 0.013516116210724202,\n \"acc_norm\": 0.794413407821229,\n \"acc_norm_stderr\": 0.013516116210724202\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043693,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043693\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n \"acc_stderr\": 0.021670058885510796,\n \"acc_norm\": 0.8231511254019293,\n \"acc_norm_stderr\": 0.021670058885510796\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062072,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062072\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6702127659574468,\n \"acc_stderr\": 0.028045946942042405,\n \"acc_norm\": 0.6702127659574468,\n \"acc_norm_stderr\": 0.028045946942042405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6075619295958279,\n \"acc_stderr\": 0.012471243669229104,\n \"acc_norm\": 0.6075619295958279,\n \"acc_norm_stderr\": 0.012471243669229104\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549476,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549476\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757776,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757776\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.02116621630465939,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.02116621630465939\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.023537557657892547,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.023537557657892547\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6383841795252835,\n \"mc2_stderr\": 0.014733348951101679\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479698\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \"acc_stderr\": 0.012643544762873353\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/Phoenix_DPO_60B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-18-26.493020.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-18-26.493020.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-18-26.493020.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-18-26.493020.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-18-26.493020.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-18-26.493020.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["**/details_harness|winogrande|5_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T07-18-26.493020.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_29T07_18_26.493020", "path": ["results_2024-01-29T07-18-26.493020.parquet"]}, {"split": "latest", "path": 
["results_2024-01-29T07-18-26.493020.parquet"]}]}]}
2024-01-29T07:21:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of cloudyu/Phoenix_DPO_60B Dataset automatically created during the evaluation run of model cloudyu/Phoenix_DPO_60B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T07:18:26.493020 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of cloudyu/Phoenix_DPO_60B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Phoenix_DPO_60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T07:18:26.493020(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of cloudyu/Phoenix_DPO_60B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Phoenix_DPO_60B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T07:18:26.493020(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f1c2ac900b3c71b83ae6ebd4ed12e49a85ab92cb
# Dataset Card for "word_init_disjoint_unique" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
boda/word_init_disjoint_unique
[ "region:us" ]
2024-01-29T07:44:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "labels", "dtype": "string"}, {"name": "clue", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2679064.0, "num_examples": 42793}, {"name": "test", "num_bytes": 844513.0, "num_examples": 13495}], "download_size": 2791004, "dataset_size": 3523577.0}}
2024-01-29T07:44:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for "word_init_disjoint_unique" More Information needed
[ "# Dataset Card for \"word_init_disjoint_unique\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"word_init_disjoint_unique\"\n\nMore Information needed" ]
4c34ab3f048555468da5b067c57d83751532e59c
# Dataset Card for "naive_random_unique" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
boda/naive_random_unique
[ "region:us" ]
2024-01-29T07:44:49+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "labels", "dtype": "string"}, {"name": "clue", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2994948.7214326323, "num_examples": 47844}, {"name": "test", "num_bytes": 528579.2785673678, "num_examples": 8444}], "download_size": 2797022, "dataset_size": 3523528.0}}
2024-01-29T07:44:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for "naive_random_unique" More Information needed
[ "# Dataset Card for \"naive_random_unique\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"naive_random_unique\"\n\nMore Information needed" ]
b1bd293b6fc15b41bc6f6c290ba071e95026cad3
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-tuned <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-tuned](https://huggingface.co/SCE/Mistral-7B-math-ia3-tuned) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T07:55:26.696001](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned/blob/main/results_2024-01-29T07-55-26.696001.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.597094516437562, "acc_stderr": 0.033396196017173016, "acc_norm": 0.6014163034201743, "acc_norm_stderr": 0.03407797923224814, "mc1": 0.40636474908200737, "mc1_stderr": 0.017193835812093893, "mc2": 0.5807124282513559, "mc2_stderr": 0.015370155281237467 }, "harness|arc:challenge|25": { "acc": 0.5273037542662116, "acc_stderr": 0.014589589101985998, "acc_norm": 0.5725255972696246, "acc_norm_stderr": 0.014456862944650647 }, "harness|hellaswag|10": { "acc": 0.6082453694483171, "acc_stderr": 0.004871447106554924, "acc_norm": 0.8079067914758016, "acc_norm_stderr": 0.003931408309245499 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849725, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849725 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6566037735849056, "acc_stderr": 0.02922452646912479, "acc_norm": 0.6566037735849056, "acc_norm_stderr": 0.02922452646912479 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6527777777777778, "acc_stderr": 0.039812405437178615, "acc_norm": 0.6527777777777778, "acc_norm_stderr": 0.039812405437178615 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5606936416184971, "acc_stderr": 0.037842719328874674, "acc_norm": 0.5606936416184971, "acc_norm_stderr": 0.037842719328874674 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768079, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768079 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5234042553191489, "acc_stderr": 0.03265019475033582, "acc_norm": 0.5234042553191489, "acc_norm_stderr": 0.03265019475033582 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36507936507936506, "acc_stderr": 0.024796060602699944, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.024796060602699944 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.043902592653775614, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.043902592653775614 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.026729499068349958, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.026729499068349958 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.03514528562175008, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7272727272727273, "acc_stderr": 0.0347769116216366, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.0347769116216366 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7626262626262627, "acc_stderr": 0.030313710538198906, "acc_norm": 0.7626262626262627, "acc_norm_stderr": 0.030313710538198906 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.02578772318072387, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.02578772318072387 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5512820512820513, "acc_stderr": 0.025217315184846482, "acc_norm": 0.5512820512820513, "acc_norm_stderr": 0.025217315184846482 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.02911661760608301, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.02911661760608301 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6722689075630253, "acc_stderr": 0.03048991141767323, "acc_norm": 0.6722689075630253, "acc_norm_stderr": 0.03048991141767323 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 
0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7889908256880734, "acc_stderr": 0.01749392240411265, "acc_norm": 0.7889908256880734, "acc_norm_stderr": 0.01749392240411265 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4861111111111111, "acc_stderr": 0.03408655867977749, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604246, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.729957805907173, "acc_stderr": 0.028900721906293433, "acc_norm": 0.729957805907173, "acc_norm_stderr": 0.028900721906293433 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.032521134899291884, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.032521134899291884 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7099236641221374, "acc_stderr": 0.03980066246467766, "acc_norm": 0.7099236641221374, "acc_norm_stderr": 0.03980066246467766 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7177914110429447, "acc_stderr": 0.03536117886664742, "acc_norm": 0.7177914110429447, "acc_norm_stderr": 0.03536117886664742 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8931623931623932, "acc_stderr": 0.020237149008990915, "acc_norm": 0.8931623931623932, "acc_norm_stderr": 0.020237149008990915 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.789272030651341, "acc_stderr": 0.014583812465862538, "acc_norm": 0.789272030651341, "acc_norm_stderr": 0.014583812465862538 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.653179190751445, "acc_stderr": 0.025624723994030454, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.025624723994030454 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.35307262569832404, "acc_stderr": 0.015984204545268565, "acc_norm": 0.35307262569832404, "acc_norm_stderr": 0.015984204545268565 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6633986928104575, "acc_stderr": 0.027057974624494382, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.027057974624494382 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6720257234726688, "acc_stderr": 0.026664410886937613, "acc_norm": 0.6720257234726688, "acc_norm_stderr": 0.026664410886937613 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6419753086419753, "acc_stderr": 0.026675611926037103, "acc_norm": 0.6419753086419753, "acc_norm_stderr": 0.026675611926037103 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4432624113475177, "acc_stderr": 0.029634838473766006, "acc_norm": 0.4432624113475177, "acc_norm_stderr": 0.029634838473766006 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.423728813559322, "acc_stderr": 0.012620785155885998, "acc_norm": 0.423728813559322, "acc_norm_stderr": 0.012620785155885998 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6323529411764706, "acc_stderr": 0.02928941340940319, "acc_norm": 0.6323529411764706, "acc_norm_stderr": 0.02928941340940319 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5931372549019608, "acc_stderr": 0.019873802005061177, "acc_norm": 0.5931372549019608, "acc_norm_stderr": 0.019873802005061177 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 0.04525393596302505, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302505 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.029279567411065677, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.029279567411065677 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6716417910447762, "acc_stderr": 0.033206858897443244, "acc_norm": 0.6716417910447762, "acc_norm_stderr": 0.033206858897443244 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.463855421686747, "acc_stderr": 0.03882310850890593, "acc_norm": 0.463855421686747, "acc_norm_stderr": 0.03882310850890593 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.40636474908200737, "mc1_stderr": 0.017193835812093893, "mc2": 0.5807124282513559, "mc2_stderr": 0.015370155281237467 }, "harness|winogrande|5": { "acc": 0.7655880031570639, "acc_stderr": 0.011906130106237985 }, "harness|gsm8k|5": { "acc": 0.4184988627748294, "acc_stderr": 0.013588287284030866 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
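The card above already shows how to pull a single per-task configuration with `split="train"`. Since the card also states that an additional `"results"` configuration stores the aggregated metrics, and the config metadata defines a `"latest"` split for each configuration, a complementary sketch for reading just the aggregated numbers might be:

```python
from datasets import load_dataset

# Sketch: read the aggregated metrics of the most recent run via the "results"
# configuration and its "latest" split, as described in the card and metadata above.
results = load_dataset(
    "open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned",
    "results",
    split="latest",
)
print(results[0])  # aggregated accuracies, e.g. the "all" block quoted in the card
```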
open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned
[ "region:us" ]
2024-01-29T07:57:47+00:00
{"pretty_name": "Evaluation run of SCE/Mistral-7B-math-ia3-tuned", "dataset_summary": "Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-tuned](https://huggingface.co/SCE/Mistral-7B-math-ia3-tuned) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T07:55:26.696001](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-tuned/blob/main/results_2024-01-29T07-55-26.696001.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.597094516437562,\n \"acc_stderr\": 0.033396196017173016,\n \"acc_norm\": 0.6014163034201743,\n \"acc_norm_stderr\": 0.03407797923224814,\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093893,\n \"mc2\": 0.5807124282513559,\n \"mc2_stderr\": 0.015370155281237467\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985998,\n \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6082453694483171,\n \"acc_stderr\": 0.004871447106554924,\n \"acc_norm\": 0.8079067914758016,\n \"acc_norm_stderr\": 0.003931408309245499\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849725,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849725\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699944,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699944\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293433,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293433\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990915,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990915\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.789272030651341,\n \"acc_stderr\": 0.014583812465862538,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 0.014583812465862538\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n \"acc_stderr\": 0.015984204545268565,\n \"acc_norm\": 0.35307262569832404,\n \"acc_norm_stderr\": 0.015984204545268565\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.026664410886937613,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.026664410886937613\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037103,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n \"acc_stderr\": 0.012620785155885998,\n \"acc_norm\": 0.423728813559322,\n \"acc_norm_stderr\": 0.012620785155885998\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n \"mc1_stderr\": 0.017193835812093893,\n \"mc2\": 0.5807124282513559,\n \"mc2_stderr\": 0.015370155281237467\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237985\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4184988627748294,\n \"acc_stderr\": 
0.013588287284030866\n }\n}\n```", "repo_url": "https://huggingface.co/SCE/Mistral-7B-math-ia3-tuned", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T07_55_26.696001", "path": ["**/details_harness|winogrande|5_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T07-55-26.696001.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_29T07_55_26.696001", "path": ["results_2024-01-29T07-55-26.696001.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T07-55-26.696001.parquet"]}]}]}
2024-01-29T07:58:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-tuned Dataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-tuned on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T07:55:26.696001 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-tuned\n\n\n\nDataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-tuned on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T07:55:26.696001(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-tuned\n\n\n\nDataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-tuned on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T07:55:26.696001(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
41c711c6eda8879ab364a3d1b7cc22126d5ffeb2
# Dataset Card for "word_init_disjoint_half" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
boda/word_init_disjoint_half
[ "region:us" ]
2024-01-29T08:04:09+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "labels", "dtype": "string"}, {"name": "clue", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4163893.0, "num_examples": 69339}, {"name": "test", "num_bytes": 1306598.0, "num_examples": 21707}], "download_size": 4312817, "dataset_size": 5470491.0}}
2024-01-29T08:04:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for "word_init_disjoint_half" More Information needed
[ "# Dataset Card for \"word_init_disjoint_half\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"word_init_disjoint_half\"\n\nMore Information needed" ]
5a919c49d49b6a8ae02addfd88ef1a59589647c0
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned10 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-pruned10](https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T08:02:29.802840](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10/blob/main/results_2024-01-29T08-02-29.802840.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6071140736745556, "acc_stderr": 0.03313148079500382, "acc_norm": 0.6116841088566581, "acc_norm_stderr": 0.03380225533253272, "mc1": 0.5299877600979193, "mc1_stderr": 0.01747199209169754, "mc2": 0.6816125161993237, "mc2_stderr": 0.015141567513812132 }, "harness|arc:challenge|25": { "acc": 0.5861774744027304, "acc_stderr": 0.014392730009221005, "acc_norm": 0.6313993174061433, "acc_norm_stderr": 0.014097810678042203 }, "harness|hellaswag|10": { "acc": 0.6624178450507867, "acc_stderr": 0.0047191878909480685, "acc_norm": 0.8471420035849433, "acc_norm_stderr": 0.0035911513232683456 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.04276349494376599, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.04276349494376599 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849723, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849723 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6716981132075471, "acc_stderr": 0.02890159361241178, "acc_norm": 0.6716981132075471, "acc_norm_stderr": 0.02890159361241178 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 
0.050251890762960605 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5664739884393064, "acc_stderr": 0.03778621079092056, "acc_norm": 0.5664739884393064, "acc_norm_stderr": 0.03778621079092056 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.39215686274509803, "acc_stderr": 0.04858083574266344, "acc_norm": 0.39215686274509803, "acc_norm_stderr": 0.04858083574266344 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5319148936170213, "acc_stderr": 0.03261936918467382, "acc_norm": 0.5319148936170213, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.42105263157894735, "acc_stderr": 0.046446020912223177, "acc_norm": 0.42105263157894735, "acc_norm_stderr": 0.046446020912223177 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6, "acc_stderr": 0.040824829046386284, "acc_norm": 0.6, "acc_norm_stderr": 0.040824829046386284 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.37037037037037035, "acc_stderr": 0.024870815251057093, "acc_norm": 0.37037037037037035, "acc_norm_stderr": 0.024870815251057093 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4126984126984127, "acc_stderr": 0.04403438954768177, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.04403438954768177 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6870967741935484, "acc_stderr": 0.02637756702864586, "acc_norm": 0.6870967741935484, "acc_norm_stderr": 0.02637756702864586 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885417, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885417 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932022, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932022 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8497409326424871, "acc_stderr": 0.02578772318072386, "acc_norm": 0.8497409326424871, "acc_norm_stderr": 0.02578772318072386 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5461538461538461, "acc_stderr": 0.025242770987126184, "acc_norm": 0.5461538461538461, "acc_norm_stderr": 0.025242770987126184 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.027840811495871934, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.027840811495871934 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.030684737115135363, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.030684737115135363 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 
0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7963302752293578, "acc_stderr": 0.017266742087630797, "acc_norm": 0.7963302752293578, "acc_norm_stderr": 0.017266742087630797 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.47685185185185186, "acc_stderr": 0.03406315360711507, "acc_norm": 0.47685185185185186, "acc_norm_stderr": 0.03406315360711507 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.03019028245350195, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.02830465794303531, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.02830465794303531 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6233183856502242, "acc_stderr": 0.032521134899291884, "acc_norm": 0.6233183856502242, "acc_norm_stderr": 0.032521134899291884 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306085, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306085 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8181818181818182, "acc_stderr": 0.03520893951097653, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.03520893951097653 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650743, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650743 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326466, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326466 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.02220930907316561, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.02220930907316561 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7816091954022989, "acc_stderr": 0.01477435831993449, "acc_norm": 0.7816091954022989, "acc_norm_stderr": 0.01477435831993449 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.684971098265896, "acc_stderr": 0.025009313790069727, "acc_norm": 0.684971098265896, "acc_norm_stderr": 0.025009313790069727 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3452513966480447, "acc_stderr": 0.015901432608930358, "acc_norm": 0.3452513966480447, "acc_norm_stderr": 0.015901432608930358 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6928104575163399, "acc_stderr": 0.02641560191438899, "acc_norm": 0.6928104575163399, "acc_norm_stderr": 0.02641560191438899 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464485, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464485 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6790123456790124, "acc_stderr": 0.02597656601086274, "acc_norm": 0.6790123456790124, "acc_norm_stderr": 0.02597656601086274 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.46099290780141844, "acc_stderr": 0.02973659252642444, "acc_norm": 0.46099290780141844, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44002607561929596, "acc_stderr": 0.012678037478574513, "acc_norm": 0.44002607561929596, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6066176470588235, "acc_stderr": 0.029674288281311155, "acc_norm": 0.6066176470588235, "acc_norm_stderr": 0.029674288281311155 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.619281045751634, "acc_stderr": 0.019643801557924803, "acc_norm": 0.619281045751634, "acc_norm_stderr": 0.019643801557924803 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04265792110940589, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04265792110940589 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7061224489795919, "acc_stderr": 0.02916273841024977, "acc_norm": 0.7061224489795919, "acc_norm_stderr": 0.02916273841024977 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7661691542288557, "acc_stderr": 0.02992941540834839, "acc_norm": 0.7661691542288557, "acc_norm_stderr": 0.02992941540834839 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.5299877600979193, "mc1_stderr": 0.01747199209169754, "mc2": 0.6816125161993237, "mc2_stderr": 0.015141567513812132 }, "harness|winogrande|5": { "acc": 0.7734806629834254, "acc_stderr": 0.011764149054698338 }, "harness|gsm8k|5": { "acc": 0.40106141015921154, "acc_stderr": 0.013500158922245547 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
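The card above explains that an additional "results" configuration stores the aggregated metrics of the run, but it only shows how to load a single task config. Below is a small, hedged sketch of reading that aggregate config; the `latest` split name follows the pattern used by the per-task configs in this repo's metadata, and the exact column layout is not documented here, so the code only inspects whatever schema it finds.

```python
# Hedged sketch: load the aggregated "results" config mentioned in the card above
# and inspect it, without assuming a particular column layout.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10",
    "results",
    split="latest",  # assumed split name, mirroring the per-task configs
)
print(results.column_names)  # see which aggregated metrics are stored
print(results[0])            # first row of the aggregated results
```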
open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10
[ "region:us" ]
2024-01-29T08:04:46+00:00
{"pretty_name": "Evaluation run of SCE/Mistral-7B-math-ia3-pruned10", "dataset_summary": "Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-pruned10](https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T08:02:29.802840](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10/blob/main/results_2024-01-29T08-02-29.802840.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6071140736745556,\n \"acc_stderr\": 0.03313148079500382,\n \"acc_norm\": 0.6116841088566581,\n \"acc_norm_stderr\": 0.03380225533253272,\n \"mc1\": 0.5299877600979193,\n \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6816125161993237,\n \"mc2_stderr\": 0.015141567513812132\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221005,\n \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.014097810678042203\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6624178450507867,\n \"acc_stderr\": 0.0047191878909480685,\n \"acc_norm\": 0.8471420035849433,\n \"acc_norm_stderr\": 0.0035911513232683456\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072386,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072386\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126184,\n \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126184\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871934,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871934\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303531,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303531\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n \"acc_stderr\": 0.01477435831993449,\n 
\"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.01477435831993449\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069727,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069727\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n \"acc_stderr\": 0.015901432608930358,\n \"acc_norm\": 0.3452513966480447,\n \"acc_norm_stderr\": 0.015901432608930358\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438899,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438899\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.7661691542288557,\n \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5299877600979193,\n \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6816125161993237,\n \"mc2_stderr\": 0.015141567513812132\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40106141015921154,\n \"acc_stderr\": 0.013500158922245547\n }\n}\n```", "repo_url": 
"https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned10", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|arc:challenge|25_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|gsm8k|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hellaswag|10_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-02-29.802840.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-02-29.802840.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-02-29.802840.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T08-02-29.802840.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-02-29.802840.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T08_02_29.802840", "path": ["**/details_harness|winogrande|5_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T08-02-29.802840.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_29T08_02_29.802840", "path": ["results_2024-01-29T08-02-29.802840.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T08-02-29.802840.parquet"]}]}]}
2024-01-29T08:05:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned10 Dataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-pruned10 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the loading sketch just after this card): ## Latest results These are the latest results from run 2024-01-29T08:02:29.802840 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
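The card above says the details of a run can be loaded "for instance" as follows, but the accompanying code block was stripped in this plain-text rendering. Below is a minimal sketch of the load. The repository id is an assumption (it follows the details_<org>__<model> naming pattern used by the other cards in this dump), while the "harness_gsm8k_5" config and the "latest" split are taken from the config list above.

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> pattern of the Open LLM Leaderboard.
repo = "open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned10"

# "harness_gsm8k_5" is one of the per-task configs listed in the metadata above;
# the "latest" split always resolves to the most recent timestamped run.
details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(details)  # inspect the number of rows and the available columns
```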
[ "# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned10\n\n\n\nDataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-pruned10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T08:02:29.802840(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned10\n\n\n\nDataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-pruned10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T08:02:29.802840(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
de6f9dcf5c68799f1ec2a85f96a8b1ce7a9b9c59
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned20 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-pruned20](https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-29T08:07:52.412937](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20/blob/main/results_2024-01-29T08-07-52.412937.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6055660350620623, "acc_stderr": 0.033190282510517935, "acc_norm": 0.6099875699478283, "acc_norm_stderr": 0.03385986318812193, "mc1": 0.5201958384332925, "mc1_stderr": 0.017489216849737053, "mc2": 0.6773630200722127, "mc2_stderr": 0.015189227668395784 }, "harness|arc:challenge|25": { "acc": 0.5819112627986348, "acc_stderr": 0.01441398839699608, "acc_norm": 0.6305460750853242, "acc_norm_stderr": 0.014104578366491888 }, "harness|hellaswag|10": { "acc": 0.6550487950607449, "acc_stderr": 0.004743808792037865, "acc_norm": 0.8441545508862777, "acc_norm_stderr": 0.003619674864035016 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6381578947368421, "acc_stderr": 0.03910525752849723, "acc_norm": 0.6381578947368421, "acc_norm_stderr": 0.03910525752849723 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6641509433962264, "acc_stderr": 0.02906722014664483, "acc_norm": 0.6641509433962264, "acc_norm_stderr": 0.02906722014664483 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.049888765156985884, "acc_norm": 0.44, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5838150289017341, "acc_stderr": 0.03758517775404947, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.03758517775404947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726367, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726367 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099834, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099834 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.38596491228070173, "acc_stderr": 0.04579639422070434, "acc_norm": 0.38596491228070173, "acc_norm_stderr": 0.04579639422070434 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.373015873015873, "acc_stderr": 0.02490699045899257, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.02490699045899257 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377563, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377563 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6645161290322581, "acc_stderr": 0.02686020644472435, "acc_norm": 0.6645161290322581, "acc_norm_stderr": 0.02686020644472435 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.03517603540361008, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.03517603540361008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.703030303030303, "acc_stderr": 0.0356796977226805, "acc_norm": 0.703030303030303, "acc_norm_stderr": 0.0356796977226805 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932022, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932022 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.02649905770139744, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.02649905770139744 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5461538461538461, "acc_stderr": 0.025242770987126184, "acc_norm": 0.5461538461538461, "acc_norm_stderr": 0.025242770987126184 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3, "acc_stderr": 0.027940457136228402, "acc_norm": 0.3, "acc_norm_stderr": 0.027940457136228402 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6554621848739496, "acc_stderr": 0.03086868260412163, "acc_norm": 0.6554621848739496, "acc_norm_stderr": 0.03086868260412163 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.038969819642573754, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.038969819642573754 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7926605504587156, "acc_stderr": 0.017381415563608674, "acc_norm": 0.7926605504587156, "acc_norm_stderr": 0.017381415563608674 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.44907407407407407, "acc_stderr": 0.03392238405321616, "acc_norm": 0.44907407407407407, "acc_norm_stderr": 0.03392238405321616 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7598039215686274, "acc_stderr": 0.02998373305591361, "acc_norm": 0.7598039215686274, "acc_norm_stderr": 0.02998373305591361 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7552742616033755, "acc_stderr": 0.027985699387036423, "acc_norm": 0.7552742616033755, "acc_norm_stderr": 0.027985699387036423 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6367713004484304, "acc_stderr": 0.03227790442850499, "acc_norm": 0.6367713004484304, "acc_norm_stderr": 0.03227790442850499 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306085, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306085 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7222222222222222, "acc_stderr": 0.04330043749650743, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.04330043749650743 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.03487825168497892, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7184466019417476, "acc_stderr": 0.04453254836326466, "acc_norm": 0.7184466019417476, "acc_norm_stderr": 0.04453254836326466 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8632478632478633, "acc_stderr": 0.02250903393707778, "acc_norm": 0.8632478632478633, "acc_norm_stderr": 0.02250903393707778 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7905491698595147, "acc_stderr": 0.0145513105681437, "acc_norm": 0.7905491698595147, "acc_norm_stderr": 0.0145513105681437 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6936416184971098, "acc_stderr": 0.024818350129436593, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.024818350129436593 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3787709497206704, "acc_stderr": 0.016223533510365117, "acc_norm": 0.3787709497206704, "acc_norm_stderr": 0.016223533510365117 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6830065359477124, "acc_stderr": 0.026643278474508755, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.026643278474508755 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6816720257234726, "acc_stderr": 0.026457225067811025, "acc_norm": 0.6816720257234726, "acc_norm_stderr": 0.026457225067811025 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6882716049382716, "acc_stderr": 0.02577311116963045, "acc_norm": 0.6882716049382716, "acc_norm_stderr": 0.02577311116963045 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44132985658409385, "acc_stderr": 0.012682016335646666, "acc_norm": 0.44132985658409385, "acc_norm_stderr": 0.012682016335646666 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6286764705882353, "acc_stderr": 0.029349803139765873, "acc_norm": 0.6286764705882353, "acc_norm_stderr": 0.029349803139765873 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6127450980392157, "acc_stderr": 0.019706875804085644, "acc_norm": 0.6127450980392157, "acc_norm_stderr": 0.019706875804085644 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.02904308868330433, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.02904308868330433 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7761194029850746, "acc_stderr": 0.0294752502360172, "acc_norm": 0.7761194029850746, "acc_norm_stderr": 0.0294752502360172 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.79, "acc_stderr": 0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-virology|5": { "acc": 0.4939759036144578, "acc_stderr": 0.03892212195333045, "acc_norm": 0.4939759036144578, "acc_norm_stderr": 0.03892212195333045 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5201958384332925, "mc1_stderr": 0.017489216849737053, "mc2": 0.6773630200722127, "mc2_stderr": 0.015189227668395784 }, "harness|winogrande|5": { "acc": 0.7687450670876085, "acc_stderr": 0.011850040124850508 }, "harness|gsm8k|5": { "acc": 0.41925701288855194, "acc_stderr": 0.013591720959042115 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
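As a complement to the per-task loading example in the card above, the sketch below loads the aggregated "results" configuration that the card describes. The config name and the "latest" split come from the config list in this dump; the exact column layout of the aggregated table is not documented here, so the snippet only prints it for inspection.

```python
from datasets import load_dataset

# Aggregated, run-level metrics are stored under the "results" config;
# the "latest" split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20",
    "results",
    split="latest",
)
print(results.column_names)  # layout of the aggregated table may differ between runs
```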
open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20
[ "region:us" ]
2024-01-29T08:10:10+00:00
{"pretty_name": "Evaluation run of SCE/Mistral-7B-math-ia3-pruned20", "dataset_summary": "Dataset automatically created during the evaluation run of model [SCE/Mistral-7B-math-ia3-pruned20](https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T08:07:52.412937](https://huggingface.co/datasets/open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20/blob/main/results_2024-01-29T08-07-52.412937.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6055660350620623,\n \"acc_stderr\": 0.033190282510517935,\n \"acc_norm\": 0.6099875699478283,\n \"acc_norm_stderr\": 0.03385986318812193,\n \"mc1\": 0.5201958384332925,\n \"mc1_stderr\": 0.017489216849737053,\n \"mc2\": 0.6773630200722127,\n \"mc2_stderr\": 0.015189227668395784\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.01441398839699608,\n \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6550487950607449,\n \"acc_stderr\": 0.004743808792037865,\n \"acc_norm\": 0.8441545508862777,\n \"acc_norm_stderr\": 0.003619674864035016\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126184,\n \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126184\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412163,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412163\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707778,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707778\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n \"acc_stderr\": 
0.0145513105681437,\n \"acc_norm\": 0.7905491698595147,\n \"acc_norm_stderr\": 0.0145513105681437\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n \"acc_stderr\": 0.016223533510365117,\n \"acc_norm\": 0.3787709497206704,\n \"acc_norm_stderr\": 0.016223533510365117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n \"acc_stderr\": 0.012682016335646666,\n \"acc_norm\": 0.44132985658409385,\n \"acc_norm_stderr\": 0.012682016335646666\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085644,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5201958384332925,\n \"mc1_stderr\": 0.017489216849737053,\n \"mc2\": 0.6773630200722127,\n \"mc2_stderr\": 0.015189227668395784\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41925701288855194,\n \"acc_stderr\": 0.013591720959042115\n }\n}\n```", "repo_url": 
"https://huggingface.co/SCE/Mistral-7B-math-ia3-pruned20", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|arc:challenge|25_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|gsm8k|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hellaswag|10_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_29T08_07_52.412937", "path": ["**/details_harness|winogrande|5_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T08-07-52.412937.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_29T08_07_52.412937", "path": ["results_2024-01-29T08-07-52.412937.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T08-07-52.412937.parquet"]}]}]}
2024-01-29T08:10:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned20 Dataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-pruned20 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-01-29T08:07:52.412937 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
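The loading step referenced in the card above uses the `datasets` library; `harness_winogrande_5` is just one of the listed configurations, and any other config name from the list (for example `harness_gsm8k_5` or `results`) can be substituted.

```python
from datasets import load_dataset

# Load the details for one evaluation task; swap the config name for any other
# configuration listed in this record (e.g. "harness_gsm8k_5" or "results").
data = load_dataset(
    "open-llm-leaderboard/details_SCE__Mistral-7B-math-ia3-pruned20",
    "harness_winogrande_5",
    split="train",
)
```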
[ "# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned20\n\n\n\nDataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-pruned20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T08:07:52.412937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of SCE/Mistral-7B-math-ia3-pruned20\n\n\n\nDataset automatically created during the evaluation run of model SCE/Mistral-7B-math-ia3-pruned20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-01-29T08:07:52.412937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
35b2ba6b53c88bc48300dbf58745985f2b18c358
# Dataset Card for CAS

## Dataset Description

- **Homepage:** https://clementdalloux.fr/?page_id=28
- **Pubmed:** False
- **Public:** False
- **Tasks:** TXTCLASS

We manually annotated two corpora from the biomedical field. The ESSAI corpus contains clinical trial protocols in French. They were mainly obtained from the National Cancer Institute. The typical protocol consists of two parts: the summary of the trial, which indicates the purpose of the trial and the methods applied; and a detailed description of the trial with the inclusion and exclusion criteria. The CAS corpus contains clinical cases published in scientific literature and training material. They are published in different journals from French-speaking countries (France, Belgium, Switzerland, Canada, African countries, tropical countries) and are related to various medical specialties (cardiology, urology, oncology, obstetrics, pulmonology, gastro-enterology). The purpose of clinical cases is to describe clinical situations of patients. Hence, their content is close to the content of clinical narratives (description of diagnoses, treatments or procedures, evolution, family history, expected audience, etc.). In clinical cases, negation is frequently used for describing the patient's signs, symptoms, and diagnosis. Speculation is present as well, but less frequently.

This version only contains the annotated CAS corpus.

## Citation Information

```
@inproceedings{grabar-etal-2018-cas,
    title     = {{CAS}: {F}rench Corpus with Clinical Cases},
    author    = {Grabar, Natalia and Claveau, Vincent and Dalloux, Cl{\'e}ment},
    year      = 2018,
    month     = oct,
    booktitle = {Proceedings of the Ninth International Workshop on Health Text Mining and Information Analysis},
    publisher = {Association for Computational Linguistics},
    address   = {Brussels, Belgium},
    pages     = {122--128},
    doi       = {10.18653/v1/W18-5614},
    url       = {https://aclanthology.org/W18-5614},
    abstract  = {Textual corpora are extremely important for various NLP applications as they provide information necessary for creating, setting and testing these applications and the corresponding tools. They are also crucial for designing reliable methods and reproducible results. Yet, in some areas, such as the medical area, due to confidentiality or to ethical reasons, it is complicated and even impossible to access textual data representative of those produced in these areas. We propose the CAS corpus built with clinical cases, such as they are reported in the published scientific literature in French. We describe this corpus, currently containing over 397,000 word occurrences, and the existing linguistic and semantic annotations.}
}
```
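For illustration, a minimal loading sketch with the `datasets` library is shown below. The repository id `asus-aics/cas` comes from this record, but the corpus is not public (it is distributed under a data use agreement), so the assumption that a plain `load_dataset` call works once access has been granted is exactly that: an assumption, not documented behaviour.

```python
from datasets import load_dataset

# Hypothetical sketch: assumes access to the non-public asus-aics/cas repository
# has been granted and that you are authenticated (e.g. via `huggingface-cli login`).
cas = load_dataset("asus-aics/cas")
print(cas)  # inspect the available splits before building a text-classification pipeline
```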
asus-aics/cas
[ "multilinguality:monolingual", "language:fr", "license:other", "region:us" ]
2024-01-29T08:11:29+00:00
{"language": ["fr"], "license": "other", "multilinguality": "monolingual", "pretty_name": "CAS", "bigbio_language": ["French"], "bigbio_license_shortname": "DUA", "homepage": "https://clementdalloux.fr/?page_id=28", "bigbio_pubmed": false, "bigbio_public": false, "bigbio_tasks": ["TEXT_CLASSIFICATION"]}
2024-01-29T09:00:50+00:00
[]
[ "fr" ]
TAGS #multilinguality-monolingual #language-French #license-other #region-us
# Dataset Card for CAS ## Dataset Description - Homepage: URL - Pubmed: False - Public: False - Tasks: TXTCLASS We manually annotated two corpora from the biomedical field. The ESSAI corpus contains clinical trial protocols in French. They were mainly obtained from the National Cancer Institute. The typical protocol consists of two parts: the summary of the trial, which indicates the purpose of the trial and the methods applied; and a detailed description of the trial with the inclusion and exclusion criteria. The CAS corpus contains clinical cases published in scientific literature and training material. They are published in different journals from French-speaking countries (France, Belgium, Switzerland, Canada, African countries, tropical countries) and are related to various medical specialties (cardiology, urology, oncology, obstetrics, pulmonology, gastro-enterology). The purpose of clinical cases is to describe clinical situations of patients. Hence, their content is close to the content of clinical narratives (description of diagnoses, treatments or procedures, evolution, family history, expected audience, etc.). In clinical cases, negation is frequently used for describing the patient's signs, symptoms, and diagnosis. Speculation is present as well, but less frequently. This version only contains the annotated CAS corpus.
[ "# Dataset Card for CAS", "## Dataset Description\n\n- Homepage: URL\n- Pubmed: False\n- Public: False\n- Tasks: TXTCLASS\n\n\nWe manually annotated two corpora from the biomedical field. The ESSAI corpus contains clinical trial protocols in French. They were mainly obtained from the National Cancer Institute The typical protocol consists of two parts: the summary of the trial, which indicates the purpose of the trial and the methods applied; and a detailed description of the trial with the inclusion and exclusion criteria. The CAS corpus contains clinical cases published in scientific literature and training material. They are published in different journals from French-speaking countries (France, Belgium, Switzerland, Canada, African countries, tropical countries) and are related to various medical specialties (cardiology, urology, oncology, obstetrics, pulmonology, gastro-enterology). The purpose of clinical cases is to describe clinical situations of patients. Hence, their content is close to the content of clinical narratives (description of diagnoses, treatments or procedures, evolution, family history, expected audience, etc.). In clinical cases, the negation is frequently used for describing the patient signs, symptoms, and diagnosis. Speculation is present as well but less frequently.\n\nThis version only contain the annotated CAS corpus" ]
[ "TAGS\n#multilinguality-monolingual #language-French #license-other #region-us \n", "# Dataset Card for CAS", "## Dataset Description\n\n- Homepage: URL\n- Pubmed: False\n- Public: False\n- Tasks: TXTCLASS\n\n\nWe manually annotated two corpora from the biomedical field. The ESSAI corpus contains clinical trial protocols in French. They were mainly obtained from the National Cancer Institute The typical protocol consists of two parts: the summary of the trial, which indicates the purpose of the trial and the methods applied; and a detailed description of the trial with the inclusion and exclusion criteria. The CAS corpus contains clinical cases published in scientific literature and training material. They are published in different journals from French-speaking countries (France, Belgium, Switzerland, Canada, African countries, tropical countries) and are related to various medical specialties (cardiology, urology, oncology, obstetrics, pulmonology, gastro-enterology). The purpose of clinical cases is to describe clinical situations of patients. Hence, their content is close to the content of clinical narratives (description of diagnoses, treatments or procedures, evolution, family history, expected audience, etc.). In clinical cases, the negation is frequently used for describing the patient signs, symptoms, and diagnosis. Speculation is present as well but less frequently.\n\nThis version only contain the annotated CAS corpus" ]