Each record below is a pipe-delimited row with the following columns (length statistics from the source dump):

| Column | Type | Min length | Max length |
|:----------------|:---------|:-----------|:-----------|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
c6ffd4ce49a9b79094de739a44b5623b6005c30d |
# Dataset of ethlin (Fire Emblem)
This is the dataset of ethlin (Fire Emblem), containing 44 images and their tags.
The core tags of this character are `pink_hair, long_hair, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 46.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 26.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 73 | 45.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 40.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 73 | 62.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ethlin_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
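Once loaded, the same `item.meta['tags']` field can be used to pull out a specific cluster, e.g. one outfit from the tables below. A minimal sketch, assuming the tag field supports membership checks (true for both a tag list and a tag-to-score mapping); the tag `smile` is only an illustrative choice:
```python
from waifuc.source import LocalSource

# reuse the directory extracted in the snippet above
source = LocalSource('dataset_dir')

# keep only the items whose tags contain a given tag
smiling_items = [
    item for item in source
    if 'smile' in item.meta['tags']
]
print(f'{len(smiling_items)} images tagged with smile')
```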
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, bare_shoulders, smile, jewelry, looking_at_viewer, sidelocks, bangs, detached_collar, full_body, holding, long_dress, parted_lips, shiny_hair, strapless_dress, purple_footwear, standing, transparent_background, upper_body |
| 1 | 23 |  |  |  |  |  | 1girl, cape, solo, smile, staff, open_mouth, boots, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | smile | jewelry | looking_at_viewer | sidelocks | bangs | detached_collar | full_body | holding | long_dress | parted_lips | shiny_hair | strapless_dress | purple_footwear | standing | transparent_background | upper_body | cape | staff | open_mouth | boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------|:----------|:--------------------|:------------|:--------|:------------------|:------------|:----------|:-------------|:--------------|:-------------|:------------------|:------------------|:-----------|:-------------------------|:-------------|:-------|:--------|:-------------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 1 | 23 |  |  |  |  |  | X | X | | X | | | | | | | X | | | | | | | | | X | X | X | X |
| CyberHarem/ethlin_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T07:36:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T07:46:02+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ethlin (Fire Emblem)
===============================
This is the dataset of ethlin (Fire Emblem), containing 44 images and their tags.
The core tags of this character are 'pink\_hair, long\_hair, pink\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6433b7b1b097af38f27919e07de8319d3e95ebc5 |
### Dataset Card for LangChain Issues
#### Dataset Summary
LangChain Issues is a dataset consisting of LangChain issues and pull requests associated with the LangChain repository (https://github.com/langchain-ai/langchain).
It is intended for educational purposes and can be used for semantic search or multilabel text classification.
The contents of each LangChain issue are in English and concern the domain of datasets for NLP, computer vision, and beyond. | delayedkarma/langchain-issues | [
"license:apache-2.0",
"region:us"
] | 2024-01-18T07:36:58+00:00 | {"license": "apache-2.0"} | 2024-01-18T07:45:25+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
### Dataset Card for LangChain Issues
#### Dataset Summary
LangChain Issues is a dataset consisting of LangChain issues and pull requests associated with the LangChain repository (URL
It is intended for educational purposes and can be used for semantic search or multilabel text classification.
The contents of each LangChain issue are in English and concern the domain of datasets for NLP, computer vision, and beyond. | [
"### Dataset Card for LangChain Issues",
"#### Dataset Summary\n\nLangChain Issues is a dataset consisting of LangChain issues and pull requests associated with the LangChain repository (URL \nIt is intended for educational purposes and can be used for semantic search or multilabel text classification. \nThe contents of each LangChain issue are in English and concern the domain of datasets for NLP, computer vision, and beyond."
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"### Dataset Card for LangChain Issues",
"#### Dataset Summary\n\nLangChain Issues is a dataset consisting of LangChain issues and pull requests associated with the LangChain repository (URL \nIt is intended for educational purposes and can be used for semantic search or multilabel text classification. \nThe contents of each LangChain issue are in English and concern the domain of datasets for NLP, computer vision, and beyond."
] |
7f7b6547fcba04da86d4b28661bcac387122a547 | # Dataset Card for "commonsense_generated_answers_2question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | wisenut-nlp-team/commonsense_generated_answers_2question | [
"region:us"
] | 2024-01-18T07:37:50+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "sequence": "string"}, {"name": "answer", "sequence": "string"}, {"name": "original_answer", "sequence": "string"}, {"name": "similar_contexts", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 2384105156, "num_examples": 91521}], "download_size": 1208138240, "dataset_size": 2384105156}} | 2024-01-18T07:39:47+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "commonsense_generated_answers_2question"
More Information needed | [
"# Dataset Card for \"commonsense_generated_answers_2question\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"commonsense_generated_answers_2question\"\n\nMore Information needed"
] |
97fcf7fbf71abb1243c2ef1997698cc4ed4f52ca |
# Dataset of thite (Fire Emblem)
This is the dataset of thite (Fire Emblem), containing 72 images and their tags.
The core tags of this character are `blue_hair, blue_eyes, short_hair, bangs, headband, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 72 | 81.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thite_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 72 | 54.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thite_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 139 | 97.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thite_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 72 | 74.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thite_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 125.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/thite_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/thite_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, armor, fingerless_gloves, pegasus_knight_uniform_(fire_emblem), skirt, solo, spear, thighhighs, thigh_boots, belt |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, hair_flower, solo, strapless_dress, white_dress, blue_flower, detached_sleeves, medium_breasts, rose, smile, wedding_dress, feathers, official_alternate_costume, simple_background, upper_body, blush, cleavage, detached_collar, holding, white_background |
| 2 | 9 |  |  |  |  |  | 1girl, detached_collar, feather_trim, medium_breasts, wedding_dress, white_dress, white_footwear, bare_shoulders, full_body, shiny_hair, simple_background, smile, strapless_dress, solo, white_background, hair_flower, skirt_hold, holding, looking_away, collarbone, high_heels, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armor | fingerless_gloves | pegasus_knight_uniform_(fire_emblem) | skirt | solo | spear | thighhighs | thigh_boots | belt | bare_shoulders | hair_flower | strapless_dress | white_dress | blue_flower | detached_sleeves | medium_breasts | rose | smile | wedding_dress | feathers | official_alternate_costume | simple_background | upper_body | blush | cleavage | detached_collar | holding | white_background | feather_trim | white_footwear | full_body | shiny_hair | skirt_hold | looking_away | collarbone | high_heels | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:---------------------------------------|:--------|:-------|:--------|:-------------|:--------------|:-------|:-----------------|:--------------|:------------------|:--------------|:--------------|:-------------------|:-----------------|:-------|:--------|:----------------|:-----------|:-----------------------------|:--------------------|:-------------|:--------|:-----------|:------------------|:----------|:-------------------|:---------------|:-----------------|:------------|:-------------|:-------------|:---------------|:-------------|:-------------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | | | X | | | | | X | X | X | X | | | X | | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/thite_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T07:49:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:00:31+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of thite (Fire Emblem)
==============================
This is the dataset of thite (Fire Emblem), containing 72 images and their tags.
The core tags of this character are 'blue\_hair, blue\_eyes, short\_hair, bangs, headband, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6ef0630290f0d058b46ebe14432f5055424c0976 |
# Dataset of teeny (Fire Emblem)
This is the dataset of teeny (Fire Emblem), containing 52 images and their tags.
The core tags of this character are `long_hair, twintails, purple_eyes, purple_hair, multi-tied_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 52 | 43.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teeny_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 52 | 28.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teeny_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 86 | 49.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teeny_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 52 | 39.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teeny_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 86 | 66.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teeny_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/teeny_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, dress, looking_at_viewer, simple_background, solo, smile, white_background, closed_mouth, bare_shoulders, sleeveless, black_gloves, holding_book, jewelry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | looking_at_viewer | simple_background | solo | smile | white_background | closed_mouth | bare_shoulders | sleeveless | black_gloves | holding_book | jewelry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------------------|:-------|:--------|:-------------------|:---------------|:-----------------|:-------------|:---------------|:---------------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/teeny_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T07:49:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T07:58:12+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of teeny (Fire Emblem)
==============================
This is the dataset of teeny (Fire Emblem), containing 52 images and their tags.
The core tags of this character are 'long\_hair, twintails, purple\_eyes, purple\_hair, multi-tied\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1b87f2358b60e1c2a71f7ce16fb010378d52ae0f |
# Dataset of clea (Fire Emblem)
This is the dataset of clea (Fire Emblem), containing 36 images and their tags.
The core tags of this character are `blonde_hair, long_hair, ponytail, brown_eyes, breasts, medium_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 36 | 40.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clea_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 36 | 24.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clea_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 73 | 46.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clea_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 36 | 36.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clea_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 73 | 62.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clea_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/clea_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, simple_background, solo, dress, full_body, helmet, pantyhose, smile, black_leggings, high_heels, looking_at_viewer, white_background, arms_behind_back, bridal_gauntlets, gloves, holding_weapon, open_mouth, shoulder_armor |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | simple_background | solo | dress | full_body | helmet | pantyhose | smile | black_leggings | high_heels | looking_at_viewer | white_background | arms_behind_back | bridal_gauntlets | gloves | holding_weapon | open_mouth | shoulder_armor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:------------|:---------|:------------|:--------|:-----------------|:-------------|:--------------------|:-------------------|:-------------------|:-------------------|:---------|:-----------------|:-------------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/clea_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T07:49:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T07:56:37+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of clea (Fire Emblem)
=============================
This is the dataset of clea (Fire Emblem), containing 36 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, ponytail, brown\_eyes, breasts, medium\_breasts, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5e42324c8fbaf25210d0a4f13c647aae79ec6832 | # Dataset Card for "covost2_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Codec-SUPERB/covost2_synth | [
"region:us"
] | 2024-01-18T07:54:55+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "original", "path": "data/original-*"}, {"split": "academicodec_hifi_16k_320d", "path": "data/academicodec_hifi_16k_320d-*"}, {"split": "academicodec_hifi_16k_320d_large_uni", "path": "data/academicodec_hifi_16k_320d_large_uni-*"}, {"split": "academicodec_hifi_24k_320d", "path": "data/academicodec_hifi_24k_320d-*"}, {"split": "audiodec_24k_320d", "path": "data/audiodec_24k_320d-*"}, {"split": "dac_16k", "path": "data/dac_16k-*"}, {"split": "dac_24k", "path": "data/dac_24k-*"}, {"split": "dac_44k", "path": "data/dac_44k-*"}, {"split": "encodec_24k", "path": "data/encodec_24k-*"}, {"split": "funcodec_en_libritts_16k_gr1nq32ds320", "path": "data/funcodec_en_libritts_16k_gr1nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_gr8nq32ds320", "path": "data/funcodec_en_libritts_16k_gr8nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds320", "path": "data/funcodec_en_libritts_16k_nq32ds320-*"}, {"split": "funcodec_en_libritts_16k_nq32ds640", "path": "data/funcodec_en_libritts_16k_nq32ds640-*"}, {"split": "funcodec_zh_en_16k_nq32ds320", "path": "data/funcodec_zh_en_16k_nq32ds320-*"}, {"split": "funcodec_zh_en_16k_nq32ds640", "path": "data/funcodec_zh_en_16k_nq32ds640-*"}, {"split": "speech_tokenizer_16k", "path": "data/speech_tokenizer_16k-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}], "splits": [{"name": "original", "num_bytes": 4043997731.972, "num_examples": 23778}, {"name": "academicodec_hifi_16k_320d", "num_bytes": 3952554603.408, "num_examples": 23778}, {"name": "academicodec_hifi_16k_320d_large_uni", "num_bytes": 3952554603.408, "num_examples": 23778}, {"name": "academicodec_hifi_24k_320d", "num_bytes": 5923762216.848, "num_examples": 23778}, {"name": "audiodec_24k_320d", "num_bytes": 5930095724.928, "num_examples": 23778}, {"name": "dac_16k", "num_bytes": 3954367438.128, "num_examples": 23778}, {"name": "dac_24k", "num_bytes": 5930095724.928, "num_examples": 23778}, {"name": "dac_44k", "num_bytes": 10894130391.564, "num_examples": 23778}, {"name": "encodec_24k", "num_bytes": 5930140332.456, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_gr1nq32ds320", "num_bytes": 3953549665.152, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_gr8nq32ds320", "num_bytes": 3953549665.152, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_nq32ds320", "num_bytes": 3953549665.152, "num_examples": 23778}, {"name": "funcodec_en_libritts_16k_nq32ds640", "num_bytes": 3953549665.152, "num_examples": 23778}, {"name": "funcodec_zh_en_16k_nq32ds320", "num_bytes": 3953549665.152, "num_examples": 23778}, {"name": "funcodec_zh_en_16k_nq32ds640", "num_bytes": 3953549665.152, "num_examples": 23778}, {"name": "speech_tokenizer_16k", "num_bytes": 3964348491.408, "num_examples": 23778}], "download_size": 72248199219, "dataset_size": 78197345249.96}} | 2024-01-18T13:58:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "covost2_synth"
More Information needed | [
"# Dataset Card for \"covost2_synth\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"covost2_synth\"\n\nMore Information needed"
] |
ba796d2931e03fa507bf9e747f849a37d4314afc | # Dataset Card for Dataset Name
A dataset of 100 corporate sustainability reports with manually extracted scope 1, 2 and 3 greenhouse gas emission values.
## Dataset Details
### Dataset Description
Data about corporate greenhouse gas emissions is usually published only as part of sustainability report PDFs, which are not a machine-readable format. Interested actors have to manually extract emission data from these reports, which is a tedious and time-consuming process. An automatic information-extraction system could solve this issue.
To evaluate such information-extraction systems and to encourage research into solving this task, a dataset of sustainability reports and manually-extracted emission values is created and published.
While this dataset is intended for evaluation, the accompanying [sustainability-report-emissions](https://huggingface.co/datasets/nopperl/sustainability-report-emissions) dataset is intended for training/finetuning models.
- **License:** Open Data Commons Public Domain Dedication and License (PDDL)
### Dataset Sources [optional]
- **Repository:** https://github.com/nopperl/corporate_emission_reports
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
The dataset is intended to be used to evaluate automatic systems to extract machine-readable greenhouse gas emission data from sustainability reports.
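As a rough sketch of how such an evaluation might start (the split name `train` is an assumption; check the repository for the actual configuration), both this evaluation set and the companion training set can be loaded with the `datasets` library:
```python
from datasets import load_dataset

# evaluation set: manually extracted emission values (this dataset)
eval_ds = load_dataset("nopperl/corporate-emission-reports", split="train")

# companion set intended for training/finetuning
train_ds = load_dataset("nopperl/sustainability-report-emissions", split="train")

# compare a system's predictions against the annotated totals, e.g.:
print(eval_ds[0]["scope_1"], eval_ds[0]["scope_2_market"], eval_ds[0]["scope_3"])
```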
## Dataset Structure
- `id` (string): unique instance id, ex. `"0012"`.
- `emission_year` (int): the year of the extracted emissions, which is useful for reports that contain information about multiple years.
- `scope_1` (double): total scope 1 emissions in metric tonnes of CO2eq.
- `scope_2_market` (double): total market-based scope 2 emissions in metric tonnes of CO2eq.
- `scope_2_location` (double): total location-based scope 2 emissions in metric tonnes of CO2eq.
- `scope_3` (double): total scope 3 emissions in metric tonnes of CO2eq.
- `scope_1_page` (list<int>): set of pages containing total scope 1 emission data.
- `scope_2_market_page` (list<int>): set of pages containing total market-based scope 2 emission data.
- `scope_2_location_page` (list<int>): set of pages containing total location-based scope 2 emission data.
- `scope_3_page` (list<int>): set of pages containing total scope 3 emission data.
- `url` (string): the URL to the sustainability report PDF.
- `sha256` (string): SHA-256 hash string of the report PDF to ensure the integrity of downloaded files.
- `subset` (string): indication of whether the report comes from the set of Euro Stoxx 50 (`eurostoxx`), NYSE (`nyse`) or Nikkei 225 (`tyo`) corporations.
The remaining 15 fields contain the data for each of the [15 scope 3 emission categories](https://ghgprotocol.org/scope-3-calculation-guidance-2).
The dataset only contains the URL to the report PDF. A helper script to download these files is provided at: https://github.com/nopperl/corporate_emission_reports/blob/main/download_documents.py.
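In the same spirit as that helper script (the snippet below is an independent, simplified sketch, not the script itself), a report can be fetched from a row's `url` field and checked against its `sha256` field using only the standard library:
```python
import hashlib
import urllib.request

def download_and_verify(url: str, expected_sha256: str, out_path: str) -> None:
    """Download a report PDF and check its SHA-256 against the dataset entry."""
    urllib.request.urlretrieve(url, out_path)
    with open(out_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != expected_sha256:
        raise ValueError(f"hash mismatch for {url}: got {digest}")

# example usage with fields taken from a dataset row:
# download_and_verify(row["url"], row["sha256"], "report.pdf")
```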
## Dataset Creation
### Curation Rationale
To our knowledge, there is no publicly-available dataset containing manually extracted (self-reported) greenhouse gas emission data from sustainability reports. Hence, this dataset was collected to enable the evaluation of automatic information-extraction systems.
### Source Data
The dataset is based on sustainability reports from corporations in Europe, North America and Asia.
#### Data Collection and Processing
To ensure geographic diversity, the sustainability reports are sourced from three sets of corporations. The first set consists of 39 corporations tracked by the Euro Stoxx 50 stock index as of 18 September 2023. Of the 11 missing Euro Stoxx 50 corporations, 5 are not considered because no publicly-available sustainability reports were found and 6 are not considered because it was not possible to extract unambiguous emission values. The second set is a random selection of 39 corporations listed on the New York Stock Exchange as of December 2023. The third set is a random selection of 22 corporations tracked by the Nikkei 225 index as of October 2023. Note that this selection strategy is intentionally biased towards larger corporations due to the assumption that they have a higher likelihood of publishing sustainability reports.
For every corporation, the most recent sustainability report is downloaded from the official source. In some cases, the sustainability report is part of a larger annual report.
Sustainability reports based on the [Carbon Disclosure Project](https://www.cdp.net/en/guidance) or [Global Reporting Initiative](https://www.globalreporting.org/standards/) templates are not considered as they already follow a consistent structure.
#### Who are the source data producers?
The sustainability reports are produced by corporations themselves and optionally verified by third parties. Thus, they only contain self-reported emission information.
### Annotations [optional]
The sustainability reports are annotated with manually-extracted emission data, which forms the main purpose of this dataset.
#### Annotation process
The annotation was based on the greenhouse gas emission definitions of the [GHG Protocol Corporate Standard](https://ghgprotocol.org/corporate-standard):
- Scope 1: A reporting organization’s direct GHG emissions.
- Scope 2: A reporting organization’s emissions associated with the generation of electricity, heating/cooling, or steam purchased for own consumption.
- Scope 3: A reporting organization’s indirect emissions other than those covered in scope 2.
Only emission data about the reporting corporation was extracted; individual values for subsidiaries were ignored.
Scope 2 emissions are annotated as market-based by default if no indication about the calculation method is given in the report.
Values which could not be unambiguously extracted were noted as missing.
No automatic tools were used in the extraction process.
The extracted data is not validated by a third party or verified against other data sources.
#### Who are the annotators?
The data was annotated by a single person without special expertise in sustainability reporting.
#### Personal and Sensitive Information
The dataset contains only public information.
## Bias, Risks, and Limitations
The emission information was extracted by a single non-expert. No guarantee can be given that the data is completely correct.
The dataset does not contain sustainability reports of small enterprises or non-profit organisations.
Even though some care was taken to ensure geographic diversity, the dataset does not include sustainability reports from large parts of the world.
## Citation [optional]
**BibTeX:**
[More Information Needed]
| nopperl/corporate-emission-reports | [
"size_categories:n<1K",
"language:en",
"license:pddl",
"climate",
"region:us"
] | 2024-01-18T08:02:10+00:00 | {"language": ["en"], "license": "pddl", "size_categories": ["n<1K"], "tags": ["climate"], "dataset_info": {"features": [{"name": "scope_1", "dtype": "double"}, {"name": "scope_2_market", "dtype": "double"}, {"name": "scope_2_location", "dtype": "double"}, {"name": "scope_3", "dtype": "double"}]}} | 2024-02-03T20:13:54+00:00 | [] | [
"en"
] | TAGS
#size_categories-n<1K #language-English #license-pddl #climate #region-us
| # Dataset Card for Dataset Name
A dataset of 100 corporate sustainability reports with manually extracted scope 1, 2 and 3 greenhouse gas emission values.
## Dataset Details
### Dataset Description
Data about corporate greenhouse gas emissions is usually published only as part of sustainability report PDF's, which is not a machine-readable format. Interested actors have to manually extract emission data from these reports, which is a tedious and time-consuming process. An automatic information-extraction system could solve this issue.
To evaluate such information-extraction systems and to encourage research into solving this task, a dataset of sustainability reports and manually-extracted emission values is created and published.
While this dataset is intended for evaluation, the accompanying sustainability-report-emissions dataset is intended for training/finetuning models.
- License: Open Data Commons Public Domain Dedication and License (PDDL)
### Dataset Sources [optional]
- Repository: URL
- Paper [optional]:
- Demo [optional]:
## Uses
The dataset is intended to be used to evaluate automatic systems to extract machine-readable greenhouse gas emission data from sustainability reports.
## Dataset Structure
- 'id' (string): unique instance id, ex. '"0012"'.
- 'emission_year' (int): the year of the extracted emissions, which is useful in reports which contain information about multiple years.
- 'scope_1' (double): total scope 1 emissions in metric tonnes of CO2eq.
- 'scope_2_market' (double): total market-based scope 2 emissions in metric tonnes of CO2eq.
- 'scope_2_location' (double): total location-based scope 2 emissions in metric tonnes of CO2eq.
- 'scope_3' (double): total scope 3 emissions in metric tonnes of CO2eq.
- 'scope_1_page' (list<int>): set of pages containing total scope 1 emission data.
- 'scope_2_market_page' (list<int>): set of pages containing total market-based scope 2 emission data.
- 'scope_2_location_page' (list<int>): set of pages containing total location-based scope 2 emission data.
- 'scope_3_page' (list<int>): set of pages containing total scope 1 emission data.
- 'url' (string): the URL to the sustainability report PDF.
- 'sha256' (string): SHA-256 hash string of the report PDF to ensure the integrity of downloaded files.
- 'subset' (string): indication of whether the report comes from the set of Euro Stoxx 50 ('eurostoxx'), NYSE ('nyse') or Nikkei 225 ('tyo') corporations.
The remaining 15 fields contain the data for each of the 15 scope 3 emission categories.
The dataset only contains the URL to the report PDF. A helper script to download these files is provided at: URL
## Dataset Creation
### Curation Rationale
To our knowledge, there is no publicly-available dataset containing manually extracted (self-reported) greenhouse gas emission data from sustainability reports. Hence, this dataset was collected to enable the evaluation of automatic information-extraction systems.
### Source Data
The dataset is based on sustainability reports from corporations in Europe, North America and Asia.
#### Data Collection and Processing
To ensure geographic diversity, the sustainability reports are sourced from three sets of corporations. The first set consists of 39 corporations tracked by the Euro Stoxx 50 stock index as of 18 September 2023. Note that out of the missing 11 corporations of Euro Stoxx 50, 5 are not considered as no publicly-available sustainability reports were found and 6 are not considered it was not possible to extract unambiguous emission values. The second set is a random selection of 39 corporations listed on the New York Stock Exchange as of December 2023. The third set is a random selection of 22 corporations tracked by the Nikkei 225 index as of October 2023. Note, that this selection strategy is intentionally biased towards larger corporations due to the assumption that they have a higher likelihood of publishing sustainability reports.
For every corporation, the most recent sustainability report is downloaded from the official source. In some cases, the sustainability report is part of a larger annual report.
Sustainability reports based on the Carbon Disclosure Project or Global Reporting Initiative templates are not considered as they already follow a consistent structure.
#### Who are the source data producers?
The sustainability reports are produced by corporations themselves and optionally verified by third parties. Thus, they only contain self-reported emission information.
### Annotations [optional]
The sustainability reports are annotated with manually-extracted emission data, which forms the main purpose of this dataset.
#### Annotation process
The annotation was based on the greenhouse gas emission definitions of the GHG Protocol Corporate Standard:
- Scope 1: A reporting organization’s direct GHG emissions.
- Scope 2: A reporting organization’s emissions associated with the generation of electricity, heating/cooling, or steam purchased for own consumption.
- Scope 3: A reporting organization’s indirect emissions other than those covered in scope 2.
Only emission data about the reporting corporation was extracted, invidiual values for subsidiaries were ignored.
Scope 2 emissions are annoted as market-based by default if no indication about the calculation method is given in the report.
Values which could not be unambiguously extracted were noted as missing.
No automatic tools were used in the extraction process.
The extracted data is not validated by a third party or verified against other data sources.
#### Who are the annotators?
The data was annotated by a single person without special expertise in sustainability reporting.
#### Personal and Sensitive Information
The dataset contains only public information.
## Bias, Risks, and Limitations
The emission information was extracted by a single non-expert. No guarantee can be given that the data is completely correct.
The dataset does not contain sustainability reports of small enterprises or non-profit organisations.
Even though some care was taken to ensure geographic diversity, the dataset does not include sustainability reports from large parts of the world.
[optional]
BibTeX:
| [
"# Dataset Card for Dataset Name\n\nA dataset of 100 corporate sustainability reports with manually extracted scope 1, 2 and 3 greenhouse gas emission values.",
"## Dataset Details",
"### Dataset Description\n\nData about corporate greenhouse gas emissions is usually published only as part of sustainability report PDF's, which is not a machine-readable format. Interested actors have to manually extract emission data from these reports, which is a tedious and time-consuming process. An automatic information-extraction system could solve this issue.\n\nTo evaluate such information-extraction systems and to encourage research into solving this task, a dataset of sustainability reports and manually-extracted emission values is created and published.\n\nWhile this dataset is intended for evaluation, the accompanying sustainability-report-emissions dataset is intended for training/finetuning models.\n\n- License: Open Data Commons Public Domain Dedication and License (PDDL)",
"### Dataset Sources [optional]\n\n- Repository: URL\n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\nThe dataset is intended to be used to evaluate automatic systems to extract machine-readable greenhouse gas emission data from sustainability reports.",
"## Dataset Structure\n\n- 'id' (string): unique instance id, ex. '\"0012\"'.\n- 'emission_year' (int): the year of the extracted emissions, which is useful in reports which contain information about multiple years.\n- 'scope_1' (double): total scope 1 emissions in metric tonnes of CO2eq.\n- 'scope_2_market' (double): total market-based scope 2 emissions in metric tonnes of CO2eq.\n- 'scope_2_location' (double): total location-based scope 2 emissions in metric tonnes of CO2eq.\n- 'scope_3' (double): total scope 3 emissions in metric tonnes of CO2eq.\n- 'scope_1_page' (list<int>): set of pages containing total scope 1 emission data.\n- 'scope_2_market_page' (list<int>): set of pages containing total market-based scope 2 emission data.\n- 'scope_2_location_page' (list<int>): set of pages containing total location-based scope 2 emission data.\n- 'scope_3_page' (list<int>): set of pages containing total scope 1 emission data.\n- 'url' (string): the URL to the sustainability report PDF.\n- 'sha256' (string): SHA-256 hash string of the report PDF to ensure the integrity of downloaded files.\n- 'subset' (string): indication of whether the report comes from the set of Euro Stoxx 50 ('eurostoxx'), NYSE ('nyse') or Nikkei 225 ('tyo') corporations.\n\nThe remaining 15 fields contain the data for each of the 15 scope 3 emission categories.\n\nThe dataset only contains the URL to the report PDF. A helper script to download these files is provided at: URL",
"## Dataset Creation",
"### Curation Rationale\n\nTo our knowledge, there is no publicly-available dataset containing manually extracted (self-reported) greenhouse gas emission data from sustainability reports. Hence, this dataset was collected to enable the evaluation of automatic information-extraction systems.",
"### Source Data\n\nThe dataset is based on sustainability reports from corporations in Europe, North America and Asia.",
"#### Data Collection and Processing\n\nTo ensure geographic diversity, the sustainability reports are sourced from three sets of corporations. The first set consists of 39 corporations tracked by the Euro Stoxx 50 stock index as of 18 September 2023. Note that out of the missing 11 corporations of Euro Stoxx 50, 5 are not considered as no publicly-available sustainability reports were found and 6 are not considered it was not possible to extract unambiguous emission values. The second set is a random selection of 39 corporations listed on the New York Stock Exchange as of December 2023. The third set is a random selection of 22 corporations tracked by the Nikkei 225 index as of October 2023. Note, that this selection strategy is intentionally biased towards larger corporations due to the assumption that they have a higher likelihood of publishing sustainability reports.\n\nFor every corporation, the most recent sustainability report is downloaded from the official source. In some cases, the sustainability report is part of a larger annual report.\n\nSustainability reports based on the Carbon Disclosure Project or Global Reporting Initiative templates are not considered as they already follow a consistent structure.",
"#### Who are the source data producers?\n\nThe sustainability reports are produced by corporations themselves and optionally verified by third parties. Thus, they only contain self-reported emission information.",
"### Annotations [optional]\n\nThe sustainability reports are annotated with manually-extracted emission data, which forms the main purpose of this dataset.",
"#### Annotation process\n\nThe annotation was based on the greenhouse gas emission definitions of the GHG Protocol Corporate Standard:\n\n- Scope 1: A reporting organization’s direct GHG emissions.\n- Scope 2: A reporting organization’s emissions associated with the generation of electricity, heating/cooling, or steam purchased for own consumption.\n- Scope 3: A reporting organization’s indirect emissions other than those covered in scope 2.\n\nOnly emission data about the reporting corporation was extracted, invidiual values for subsidiaries were ignored.\n\nScope 2 emissions are annoted as market-based by default if no indication about the calculation method is given in the report.\n\nValues which could not be unambiguously extracted were noted as missing.\n\nNo automatic tools were used in the extraction process.\n\nThe extracted data is not validated by a third party or verified against other data sources.",
"#### Who are the annotators?\n\nThe data was annotated by a single person without special expertise in sustainability reporting.",
"#### Personal and Sensitive Information\n\nThe dataset contains only public information.",
"## Bias, Risks, and Limitations\n\nThe emission information was extracted by a single non-expert. No guarantee can be given that the data is completely correct.\n\nThe dataset does not contain sustainability reports of small enterprises or non-profit organisations.\n\nEven though some care was taken to ensure geographic diversity, the dataset does not include sustainability reports from large parts of the world.\n\n[optional]\n\nBibTeX:"
] | [
"TAGS\n#size_categories-n<1K #language-English #license-pddl #climate #region-us \n",
"# Dataset Card for Dataset Name\n\nA dataset of 100 corporate sustainability reports with manually extracted scope 1, 2 and 3 greenhouse gas emission values.",
"## Dataset Details",
"### Dataset Description\n\nData about corporate greenhouse gas emissions is usually published only as part of sustainability report PDF's, which is not a machine-readable format. Interested actors have to manually extract emission data from these reports, which is a tedious and time-consuming process. An automatic information-extraction system could solve this issue.\n\nTo evaluate such information-extraction systems and to encourage research into solving this task, a dataset of sustainability reports and manually-extracted emission values is created and published.\n\nWhile this dataset is intended for evaluation, the accompanying sustainability-report-emissions dataset is intended for training/finetuning models.\n\n- License: Open Data Commons Public Domain Dedication and License (PDDL)",
"### Dataset Sources [optional]\n\n- Repository: URL\n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\nThe dataset is intended to be used to evaluate automatic systems to extract machine-readable greenhouse gas emission data from sustainability reports.",
"## Dataset Structure\n\n- 'id' (string): unique instance id, ex. '\"0012\"'.\n- 'emission_year' (int): the year of the extracted emissions, which is useful in reports which contain information about multiple years.\n- 'scope_1' (double): total scope 1 emissions in metric tonnes of CO2eq.\n- 'scope_2_market' (double): total market-based scope 2 emissions in metric tonnes of CO2eq.\n- 'scope_2_location' (double): total location-based scope 2 emissions in metric tonnes of CO2eq.\n- 'scope_3' (double): total scope 3 emissions in metric tonnes of CO2eq.\n- 'scope_1_page' (list<int>): set of pages containing total scope 1 emission data.\n- 'scope_2_market_page' (list<int>): set of pages containing total market-based scope 2 emission data.\n- 'scope_2_location_page' (list<int>): set of pages containing total location-based scope 2 emission data.\n- 'scope_3_page' (list<int>): set of pages containing total scope 1 emission data.\n- 'url' (string): the URL to the sustainability report PDF.\n- 'sha256' (string): SHA-256 hash string of the report PDF to ensure the integrity of downloaded files.\n- 'subset' (string): indication of whether the report comes from the set of Euro Stoxx 50 ('eurostoxx'), NYSE ('nyse') or Nikkei 225 ('tyo') corporations.\n\nThe remaining 15 fields contain the data for each of the 15 scope 3 emission categories.\n\nThe dataset only contains the URL to the report PDF. A helper script to download these files is provided at: URL",
"## Dataset Creation",
"### Curation Rationale\n\nTo our knowledge, there is no publicly-available dataset containing manually extracted (self-reported) greenhouse gas emission data from sustainability reports. Hence, this dataset was collected to enable the evaluation of automatic information-extraction systems.",
"### Source Data\n\nThe dataset is based on sustainability reports from corporations in Europe, North America and Asia.",
"#### Data Collection and Processing\n\nTo ensure geographic diversity, the sustainability reports are sourced from three sets of corporations. The first set consists of 39 corporations tracked by the Euro Stoxx 50 stock index as of 18 September 2023. Note that out of the missing 11 corporations of Euro Stoxx 50, 5 are not considered as no publicly-available sustainability reports were found and 6 are not considered it was not possible to extract unambiguous emission values. The second set is a random selection of 39 corporations listed on the New York Stock Exchange as of December 2023. The third set is a random selection of 22 corporations tracked by the Nikkei 225 index as of October 2023. Note, that this selection strategy is intentionally biased towards larger corporations due to the assumption that they have a higher likelihood of publishing sustainability reports.\n\nFor every corporation, the most recent sustainability report is downloaded from the official source. In some cases, the sustainability report is part of a larger annual report.\n\nSustainability reports based on the Carbon Disclosure Project or Global Reporting Initiative templates are not considered as they already follow a consistent structure.",
"#### Who are the source data producers?\n\nThe sustainability reports are produced by corporations themselves and optionally verified by third parties. Thus, they only contain self-reported emission information.",
"### Annotations [optional]\n\nThe sustainability reports are annotated with manually-extracted emission data, which forms the main purpose of this dataset.",
"#### Annotation process\n\nThe annotation was based on the greenhouse gas emission definitions of the GHG Protocol Corporate Standard:\n\n- Scope 1: A reporting organization’s direct GHG emissions.\n- Scope 2: A reporting organization’s emissions associated with the generation of electricity, heating/cooling, or steam purchased for own consumption.\n- Scope 3: A reporting organization’s indirect emissions other than those covered in scope 2.\n\nOnly emission data about the reporting corporation was extracted, invidiual values for subsidiaries were ignored.\n\nScope 2 emissions are annoted as market-based by default if no indication about the calculation method is given in the report.\n\nValues which could not be unambiguously extracted were noted as missing.\n\nNo automatic tools were used in the extraction process.\n\nThe extracted data is not validated by a third party or verified against other data sources.",
"#### Who are the annotators?\n\nThe data was annotated by a single person without special expertise in sustainability reporting.",
"#### Personal and Sensitive Information\n\nThe dataset contains only public information.",
"## Bias, Risks, and Limitations\n\nThe emission information was extracted by a single non-expert. No guarantee can be given that the data is completely correct.\n\nThe dataset does not contain sustainability reports of small enterprises or non-profit organisations.\n\nEven though some care was taken to ensure geographic diversity, the dataset does not include sustainability reports from large parts of the world.\n\n[optional]\n\nBibTeX:"
] |
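To illustrate how the fields documented in the sustainability-report card above might be consumed, here is a short, hedged sketch that downloads each report PDF and checks its SHA-256 hash before reading the ground-truth totals. The repository id and split name passed to `load_dataset` are placeholder assumptions (the card only gives a URL placeholder); the field names come from the Dataset Structure section.

```python
# Hedged usage sketch for the emission-extraction evaluation set described above.
# Assumptions: the repo id and split name are placeholders; field names ('url',
# 'sha256', 'id', 'scope_1', ...) are taken from the card's Dataset Structure.
import hashlib
import requests
from datasets import load_dataset

ds = load_dataset("your-org/sustainability-report-emissions-eval", split="train")
for row in ds:
    pdf_bytes = requests.get(row["url"], timeout=60).content
    if hashlib.sha256(pdf_bytes).hexdigest() != row["sha256"]:
        print(f"integrity check failed for report {row['id']}")
        continue
    # Ground-truth totals are in metric tonnes of CO2eq; ambiguous values are missing.
    print(row["id"], row["emission_year"], row["scope_1"], row["scope_2_market"])
```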
2c570250452329ffcaff37c095ac1a00dc8fb23d |
# Dataset of sheema (Fire Emblem)
This is the dataset of sheema (Fire Emblem), containing 16 images and their tags.
The core tags of this character are `brown_hair, long_hair, red_eyes, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 25.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sheema_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 12.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sheema_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 23.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sheema_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 21.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sheema_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 35.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sheema_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sheema_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
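The IMG+TXT packages in the table above can also be used without waifuc. The sketch below assumes each image in `dataset-800.zip` is stored next to a same-named `.txt` file of comma-separated tags; the card does not document the archive layout, so treat the pairing logic as an assumption and adjust it if the contents differ.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/sheema_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)
extract_dir = 'dataset_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

# Pair each image with its same-named .txt tag file (layout assumption, see above).
for name in sorted(os.listdir(extract_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_path = os.path.join(extract_dir, stem + '.txt')
    tags = open(tag_path).read().strip() if os.path.exists(tag_path) else ''
    print(name, '->', tags)
```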
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | solo, 1girl, cape, weapon, white_background, armored_boots, gloves, shield, simple_background, full_body, shoulder_armor |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | solo | 1girl | cape | weapon | white_background | armored_boots | gloves | shield | simple_background | full_body | shoulder_armor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:-------|:---------|:-------------------|:----------------|:---------|:---------|:--------------------|:------------|:-----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sheema_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:03:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:07:15+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sheema (Fire Emblem)
===============================
This is the dataset of sheema (Fire Emblem), containing 16 images and their tags.
The core tags of this character are 'brown\_hair, long\_hair, red\_eyes, brown\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
fcdf9d25c939b8f826ba7ad7adba9484b1513b37 |
# Dataset of guinevere (Fire Emblem)
This is the dataset of guinevere (Fire Emblem), containing 26 images and their tags.
The core tags of this character are `long_hair, blonde_hair, breasts, green_eyes, medium_breasts, bangs, shiny_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 35.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinevere_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 18.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinevere_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 46 | 34.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinevere_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 31.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinevere_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 46 | 51.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guinevere_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/guinevere_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, red_dress, solo, long_sleeves, necklace, long_dress, full_body, shiny, circlet, open_book, transparent_background, looking_at_viewer, cape, closed_mouth, collarbone, fur_trim, parted_bangs, white_background, crown, holding, open_mouth, pantyhose, smile |
| 1 | 6 |  |  |  |  |  | 1girl, necklace, solo, circlet, red_dress, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | red_dress | solo | long_sleeves | necklace | long_dress | full_body | shiny | circlet | open_book | transparent_background | looking_at_viewer | cape | closed_mouth | collarbone | fur_trim | parted_bangs | white_background | crown | holding | open_mouth | pantyhose | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:---------------|:-----------|:-------------|:------------|:--------|:----------|:------------|:-------------------------|:--------------------|:-------|:---------------|:-------------|:-----------|:---------------|:-------------------|:--------|:----------|:-------------|:------------|:--------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | | | | X | | | | | | | | | | | | | | X |
| CyberHarem/guinevere_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:03:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:09:30+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of guinevere (Fire Emblem)
==================================
This is the dataset of guinevere (Fire Emblem), containing 26 images and their tags.
The core tags of this character are 'long\_hair, blonde\_hair, breasts, green\_eyes, medium\_breasts, bangs, shiny\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
84080eec4b757a7c2489ee4a8865faaa5daa654e |
# Dataset of cornelia_arnim (Fire Emblem)
This is the dataset of cornelia_arnim (Fire Emblem), containing 40 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, green_eyes, pink_hair, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 74.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 36.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 99 | 79.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 65.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 99 | 125.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cornelia_arnim_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cornelia_arnim_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, smile, circlet, cleavage, dress, looking_at_viewer, jewelry, simple_background, bare_shoulders, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | circlet | cleavage | dress | looking_at_viewer | jewelry | simple_background | bare_shoulders | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:----------|:-----------|:--------|:--------------------|:----------|:--------------------|:-----------------|:---------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/cornelia_arnim_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:03:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:12:05+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of cornelia\_arnim (Fire Emblem)
========================================
This is the dataset of cornelia\_arnim (Fire Emblem), containing 40 images and their tags.
The core tags of this character are 'breasts, long\_hair, large\_breasts, green\_eyes, pink\_hair, blue\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
4e81431b0a37edebc6d20871eb30b362aa9959b5 | # Dataset Card for "Gold-alpaca-legal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | bcijo/Gold-alpaca-legal | [
"region:us"
] | 2024-01-18T08:07:00+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "Instruction", "dtype": "string"}, {"name": "Input", "dtype": "string"}, {"name": "Output", "dtype": "string"}, {"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1466796, "num_examples": 127}], "download_size": 664187, "dataset_size": 1466796}} | 2024-01-18T08:07:04+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Gold-alpaca-legal"
More Information needed | [
"# Dataset Card for \"Gold-alpaca-legal\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Gold-alpaca-legal\"\n\nMore Information needed"
] |
93f7fb0697ca6cd2c7e5727dcba3b7200fa4014c |
Winogrande evaluation dataset for `llama.cpp` | ikawrakow/winogrande-eval-for-llama.cpp | [
"license:apache-2.0",
"region:us"
] | 2024-01-18T08:08:09+00:00 | {"license": "apache-2.0"} | 2024-01-18T08:12:13+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Winogrande evaluation dataset for 'URL' | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
07a1fc8327401a5ca19af00350512300589bdbc9 | **MULTI-TURN BAHASA INDONESIA**
**SYNTHETIC DATA**
```
Abstrak (Abstract) > Latar Belakang (Background)
Latar Belakang (Background) > Rumusan Masalah (Problem Statement)
Rumusan Masalah (Problem Statement) > Tujuan Penelitian (Research Objectives)
Tujuan Penelitian (Research Objectives) > Pembahasan (Discussion)
```
| arkanbima/MT-Penelitian.ID | [
"language:id",
"license:mit",
"region:us"
] | 2024-01-18T08:15:16+00:00 | {"language": ["id"], "license": "mit"} | 2024-01-20T10:19:22+00:00 | [] | [
"id"
] | TAGS
#language-Indonesian #license-mit #region-us
| MULTI-TURN BAHASA INDONESIA
SYNTHETIC DATA
| [] | [
"TAGS\n#language-Indonesian #license-mit #region-us \n"
] |
f6f93f819c18f436acdb7381658bf9feb66af5dd |
# Dataset of midoriko (Fire Emblem)
This is the dataset of midoriko (Fire Emblem), containing 36 images and their tags.
The core tags of this character are `green_hair, twintails, purple_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 36 | 33.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/midoriko_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 36 | 22.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/midoriko_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 67 | 40.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/midoriko_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 36 | 31.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/midoriko_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 67 | 53.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/midoriko_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/midoriko_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | 1girl, smile, solo, blush, open_mouth, looking_at_viewer, japanese_clothes, holding, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | blush | open_mouth | looking_at_viewer | japanese_clothes | holding | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:-------------|:--------------------|:-------------------|:----------|:---------------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/midoriko_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:34:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:42:04+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of midoriko (Fire Emblem)
=================================
This is the dataset of midoriko (Fire Emblem), containing 36 images and their tags.
The core tags of this character are 'green\_hair, twintails, purple\_eyes, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5aa6ec01cb21ead3145f9f0288835146dd5becbc |
# Dataset of sara (Fire Emblem)
This is the dataset of sara (Fire Emblem), containing 40 images and their tags.
The core tags of this character are `long_hair, blue_eyes, purple_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 34.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 23.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 68 | 36.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 32.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 68 | 47.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sara_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------|
| 0 | 40 |  |  |  |  |  | 1girl, dress, solo, circlet, jewelry, smile, looking_at_viewer, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | solo | circlet | jewelry | smile | looking_at_viewer | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:----------|:----------|:--------|:--------------------|:--------------------|:-------------------|
| 0 | 40 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/sara_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:34:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:40:53+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of sara (Fire Emblem)
=============================
This is the dataset of sara (Fire Emblem), containing 40 images and their tags.
The core tags of this character are 'long\_hair, blue\_eyes, purple\_hair, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
3fec8ab62be2d969959ec05d21b996d8823604d9 |
# Dataset of rena (Fire Emblem)
This is the dataset of rena (Fire Emblem), containing 24 images and their tags.
The core tags of this character are `red_hair, red_eyes, long_hair, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 32.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rena_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 18.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rena_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 51 | 36.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rena_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 29.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rena_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 51 | 52.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rena_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rena_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, solo, looking_at_viewer, jewelry, white_dress, long_sleeves, smile, hood, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | jewelry | white_dress | long_sleeves | smile | hood | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:----------|:--------------|:---------------|:--------|:-------|:--------------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/rena_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:34:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:40:31+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of rena (Fire Emblem)
=============================
This is the dataset of rena (Fire Emblem), containing 24 images and their tags.
The core tags of this character are 'red\_hair, red\_eyes, long\_hair, bangs, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
d1c09eecaa0863ae6b367c1815161c949677fb21 | text | annyorange/Text-style-dataset | [
"region:us"
] | 2024-01-18T08:37:35+00:00 | {"dataset_info": {"features": [{"name": "init_image", "dtype": "image"}, {"name": "edit_prompt", "dtype": "string"}, {"name": "style_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 673321.0, "num_examples": 275}], "download_size": 682460, "dataset_size": 673321.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-19T08:24:42+00:00 | [] | [] | TAGS
#region-us
| text | [] | [
"TAGS\n#region-us \n"
] |
a82982b7713def6b5df61b4a1a756a7404d8f822 | Dynamic Neural Architecture Optimization (DNAO)
I
Title: Dynamic Neural Architecture Optimization through Adaptive Meta-Learning for Enhanced AI Efficiency
Abstract:
In this paper, I propose a novel concept called "Dynamic Neural Architecture Optimization (DNAO) through Adaptive Meta-Learning," aimed at enhancing the efficiency
and accuracy of artificial intelligence systems. By integrating a self-evolving neural network architecture that adapts in real-time to specific problem requirements
with a meta-learning component capable of learning from past experiences, this approach can optimize performance while reducing computational costs. I'll try my best to
outline the various steps involved in developing an AI model based on this concept and discuss potential libraries, resources, and techniques useful for its implementation.
1. Initial Training:
This phase focuses on training a base model using various tasks or problems to establish an initial understanding of different neural network architectures'
effectiveness across different domains. The goal is to gather diverse experience that will serve as the foundation for meta-learning.
- Data collection and preprocessing: Gather datasets for various tasks (e.g., image recognition, NLP, speech recognition, time series analysis) and prepare the data by
normalizing, augmenting, and splitting it into training/validation/testing sets as needed. Libraries such as NumPy, pandas, and scikit-learn can help with data
manipulation and preprocessing tasks.
- Neural network architectures: Experiment with various neural network designs (e.g., Convolutional Neural Networks for image recognition or Recurrent Neural Networks
for time series analysis). Deep learning libraries like TensorFlow, PyTorch, or Keras can provide a wide range of prebuilt modules to create and train these models.
- Training loop setup: Implement a standard training loop that includes data loading, model initialization, optimization algorithm selection (e.g., Adam), and model
evaluation on the validation set using metrics like accuracy, loss, and AUC. Libraries like TensorFlow, PyTorch, or Keras offer built-in APIs for these tasks.
- Model storage: Store trained models in a format that can be easily retrieved later for meta-learning. The popular formats include HDF5 (using h5py library)
or JSON (with the json module).
Steps to take:
- Data collection and preprocessing:
* Gather datasets for various tasks (e.g., CIFAR-10 for image recognition, IMDB or AG News for NLP, TIDIGITS for speech recognition, or ECG5000 for time series analysis)
* Normalize the data if necessary using libraries like NumPy or scikit-learn
* Augment the data (if needed) to improve model generalization
* Split the dataset into training, validation, and testing sets
- Neural network architectures:
* Choose appropriate models based on the task type: Convolutional Neural Networks for image recognition (e.g., VGG, ResNet), Recurrent Neural Networks for
sequence data processing (e.g., LSTM, GRU), Transformers for NLP tasks (BERT, GPT-2/3), or Feedforward networks for speech and time series analysis
- Training loop setup:
* Initialize the chosen neural network model using a library like TensorFlow, PyTorch, or Keras
* Define a loss function (e.g., cross-entropy for classification tasks) and an optimizer algorithm (Adam, SGD)
* Create a training loop with forward propagation, backpropagation, and weight update steps
* Evaluate the model's performance on validation data after each epoch using metrics like accuracy, loss, and AUC
* Store the trained models in an appropriate format for future use (e.g., HDF5 or JSON)
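To make the training-loop and model-storage steps above concrete, here is a minimal PyTorch sketch. It is only an illustration: the synthetic data, the tiny CNN, the hyperparameters, and the output file names are placeholder assumptions, not part of the DNAO design itself.

```python
# Minimal base-model training sketch (placeholder data/architecture, see note above).
import json
import time
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

device = "cuda" if torch.cuda.is_available() else "cpu"

# Synthetic stand-in for a real task: 1,000 random 32x32 RGB images, 10 classes.
images = torch.randn(1000, 3, 32, 32)
labels = torch.randint(0, 10, (1000,))
loader = DataLoader(TensorDataset(images, labels), batch_size=64, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 16 * 16, 10),
).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

history = []  # per-epoch metrics, later consumed by the meta-learning phase
for epoch in range(5):
    start, total_loss, correct = time.time(), 0.0, 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        logits = model(x)
        loss = loss_fn(logits, y)
        loss.backward()
        optimizer.step()
        total_loss += loss.item() * len(y)
        correct += (logits.argmax(dim=1) == y).sum().item()
    history.append({
        "epoch": epoch,
        "loss": total_loss / len(images),
        "accuracy": correct / len(images),
        "seconds": time.time() - start,
    })

torch.save(model.state_dict(), "base_model.pt")    # model storage
with open("base_model_metrics.json", "w") as f:    # meta-data for the next phase
    json.dump(history, f)
```

A real run would swap in proper task datasets, track validation metrics separately, and store one such metrics file per trained base model.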
2. Meta-Learning Phase: Here, we aim to develop a meta-learner that can observe and learn from the base model's performance during its training process to gain insights
into effective neural network designs, their strengths and weaknesses, and the factors influencing efficiency.
- Observe the base model: Track the base model's performance on various tasks at different stages of its training. Collect relevant metrics like accuracy,
loss function values, training time, and resource utilization to provide the meta-learner with a comprehensive understanding of the base model's learning
process and efficiency.
- Develop the meta-learner: Implement machine learning or deep learning algorithms to analyze and learn from the collected data. This learner could use techniques
like reinforcement learning, supervised learning, or unsupervised learning depending on the available data and desired outcomes.
Steps to take:
- Data collection for meta-learning: Collect performance metrics from the base models' training process, including accuracy, loss function values, training time,
and resource utilization. These data can be stored in a separate file or directly appended to the model checkpoint file. Libraries like NumPy and pandas can help
manage this data efficiently.
- Meta-learner design: Choose an appropriate machine learning algorithm (e.g., reinforcement learning with Proximal Policy Optimization, supervised learning with a
regression model, or unsupervised learning with autoencoders) to learn from the meta-data collected during base model training. Libraries like TensorFlow, PyTorch,
scikit-learn, and OpenAI Gym can provide support for different machine learning algorithms.
- Hyperparameter optimization: Fine-tune hyperparameters for both the base model's training loop and the meta-learner using techniques such as grid search or Bayesian
optimization. Libraries like scikit-opt, Optuna, and Hyperopt can help optimize hyperparameters effectively.
- Meta-learning evaluation: Assess the performance of the meta-learner by testing it on new base models trained on different tasks and datasets. Compare its predictions
against ground truth (e.g., optimal architectures for specific problems) to evaluate its learning capabilities accurately.
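As a toy illustration of the supervised variant of the meta-learner described above, the sketch below fits a regression model that predicts final accuracy from coarse run descriptors. The descriptor fields and the handful of records are invented for illustration; in practice they would come from many logged base-model runs.

```python
# Toy supervised meta-learner (invented descriptors/records, see note above).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Each record: [num_layers, parameters (millions), training_minutes, peak_gpu_gb]
run_descriptors = np.array([
    [4, 0.5, 3.0, 1.2],
    [8, 5.0, 12.0, 3.5],
    [16, 25.0, 45.0, 7.9],
    [6, 1.2, 6.0, 2.0],
])
final_accuracy = np.array([0.71, 0.83, 0.88, 0.78])

meta_learner = RandomForestRegressor(n_estimators=200, random_state=0)
meta_learner.fit(run_descriptors, final_accuracy)

# Rank two candidate designs by predicted accuracy per gigabyte of GPU memory,
# one crude way to trade accuracy against resource utilization.
candidates = np.array([[10, 8.0, 20.0, 4.0], [20, 40.0, 70.0, 11.0]])
predicted = meta_learner.predict(candidates)
print("predicted accuracy:", predicted)
print("accuracy per GPU GB:", predicted / candidates[:, 3])
```

In practice the descriptors could be much richer (layer types, connectivity patterns, dataset statistics), and a reinforcement-learning meta-learner is an alternative, as noted above.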
3. Adaptive Architecture Generation: Based on the insights gained through meta-learning, develop an algorithm that generates customized neural network architectures
tailored to specific tasks or datasets. These architectures should be optimized for both accuracy and efficiency in a manner that dynamically adapts to new information.
- Architecture design space exploration: Generate a diverse set of possible neural network designs using different building blocks (e.g., convolutional layers, pooling
layers, recurrent layers, etc.) and connectivity patterns. These designs could range from simple to complex architectures depending on the problem's complexity and
available computational resources.
- Meta-learning-guided architecture selection: Use the insights gained from meta-learning to evaluate and rank these potential architectures based on factors like
historical performance, resource efficiency, and problem-specific features (e.g., spatial relationships for image tasks or temporal dependencies for time series
analysis).
- Adaptive architecture optimization: Apply genetic algorithms, gradient-based optimization methods, or other search techniques to refine the selected architectures
further in terms of both accuracy and resource utilization.
Steps to take:
- Architecture exploration: Implement a method to generate a diverse set of potential neural network designs based on different building blocks and connectivity patterns.
Libraries like TensorFlow or PyTorch provide useful modules (e.g., layers, optimizers) for constructing these architectures.
- Meta-learner integration: Integrate the meta-learner's insights into the architecture exploration process to rank and select candidate architectures based on their
potential performance in specific tasks or datasets. This could involve using machine learning models like Random Forests or Support Vector Machines for ranking.
- Architecture optimization: Fine-tune the selected architectures using techniques like gradient descent, genetic algorithms (using libraries such as DEAP), or Bayesian
optimization to improve their accuracy and efficiency.
- Model deployment: Incorporate the optimized neural network architecture into a new AI system that can solve specific tasks or datasets effectively.
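The sketch below illustrates one possible, deliberately simplified version of this selection step: sample candidate designs from a small search space and keep the one the surrogate scores highest. The search space, the scoring heuristic, and the cost penalty are illustrative assumptions standing in for the trained meta-learner.

```python
# Simplified meta-learning-guided architecture selection (illustrative only).
import random

random.seed(0)
SEARCH_SPACE = {
    "num_blocks": [2, 4, 6, 8],
    "width": [32, 64, 128, 256],
    "block_type": ["conv", "depthwise_conv", "recurrent"],
    "dropout": [0.0, 0.1, 0.3],
}

def sample_architecture():
    return {name: random.choice(options) for name, options in SEARCH_SPACE.items()}

def surrogate_score(arch):
    # Stand-in for meta_learner.predict(): deeper/wider designs score higher on
    # accuracy, but a cost term penalizes resource-hungry configurations.
    accuracy_proxy = 0.6 + 0.02 * arch["num_blocks"] + 0.0005 * arch["width"]
    cost_proxy = arch["num_blocks"] * arch["width"] / 2048
    return accuracy_proxy - 0.05 * cost_proxy

candidates = [sample_architecture() for _ in range(50)]
best = max(candidates, key=surrogate_score)
print("selected architecture:", best)
print("surrogate score:", round(surrogate_score(best), 3))
```

A genetic algorithm (for example via DEAP) or gradient-based/Bayesian refinement could then be applied to the selected design, as described above.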
4. Continuous Optimization:
Steps to take:
- Monitoring in-situ performance: Implement mechanisms to collect feedback metrics from the deployed AI system's operation in real-time. This could involve integrating
logging and monitoring tools like TensorBoard, Weave, or Prometheus for tracking key metrics such as accuracy, response times, resource utilization, and error rates.
- Feedback processing: Use these real-time feedback metrics to update the meta-learner's understanding of effective architectures for various scenarios. Libraries like
NumPy and pandas can help process this data.
- Dynamic architecture updates: Utilize the updated insights from the meta-learner to periodically reevaluate and possibly modify the deployed neural network
architecture in real-time, improving the AI system's efficiency. This step could involve retraining the base model or applying dynamic optimization techniques
like pruning, quantization, or knowledge distillation.
- Model retraining: Incorporate feedback from the deployed AI system's performance into the base model's training process to further enhance its understanding of
effective neural network architectures across different tasks and problem domains. This step might involve revisiting the initial training stage with updated data
and improved architecture suggestions.
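As a rough sketch of the monitoring-and-trigger part of this loop, the snippet below keeps a rolling window of live prediction outcomes and flags the system for re-optimization once quality drops below a threshold. The feedback source, window size, and threshold are illustrative assumptions.

```python
# Rolling-window monitoring with a re-optimization trigger (illustrative values).
import random
from collections import deque

WINDOW = 100              # number of recent predictions to track
TRIGGER_ACCURACY = 0.80   # below this rolling accuracy we re-optimize

recent_outcomes = deque(maxlen=WINDOW)

def record_feedback(was_correct: bool) -> bool:
    """Store one live outcome; return True when re-optimization should run."""
    recent_outcomes.append(1.0 if was_correct else 0.0)
    if len(recent_outcomes) < WINDOW:
        return False
    return sum(recent_outcomes) / WINDOW < TRIGGER_ACCURACY

# Simulated feedback stream whose quality slowly degrades over time.
random.seed(1)
for step in range(500):
    correct = random.random() < max(0.95 - step / 1000, 0.5)
    if record_feedback(correct):
        print(f"step {step}: rolling accuracy below {TRIGGER_ACCURACY}, "
              "triggering architecture re-optimization")
        recent_outcomes.clear()
```

The trigger would then kick off retraining, pruning/quantization, or a fresh architecture search, feeding the outcome back into the meta-learner.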
note from limin:
imma keep it 100. I need help with this. i been working on this idea for a while but im not the most skilled. someone please help | 222limin/Dynamic-Neural-Architecture-Optimization | [
"license:other",
"region:us"
] | 2024-01-18T08:45:02+00:00 | {"license": "other", "license_name": "paper", "license_link": "LICENSE"} | 2024-01-18T09:15:57+00:00 | [] | [] | TAGS
#license-other #region-us
| Dynamic Neural Architecture Optimization (DNAO)
I
Title: Dynamic Neural Architecture Optimization through Adaptive Meta-Learning for Enhanced AI Efficiency
Abstract:
In this paper, I propose a novel concept called "Dynamic Neural Architecture Optimization (DNAO) through Adaptive Meta-Learning," aimed at enhancing the efficiency
and accuracy of artificial intelligence systems. By integrating a self-evolving neural network architecture that adapts in real-time to specific problem requirements
with a meta-learning component capable of learning from past experiences, this approach can optimize performance while reducing computational costs. I'll try my best to
outline the various steps involved in developing an AI model based on this concept and discuss potential libraries, resources, and techniques useful for its implementation.
1. Initial Training:
This phase focuses on training a base model using various tasks or problems to establish an initial understanding of different neural network architectures'
effectiveness across different domains. The goal is to gather diverse experience that will serve as the foundation for meta-learning.
- Data collection and preprocessing: Gather datasets for various tasks (e.g., image recognition, NLP, speech recognition, time series analysis) and prepare the data by
normalizing, augmenting, and splitting it into training/validation/testing sets as needed. Libraries such as NumPy, pandas, and scikit-learn can help with data
manipulation and preprocessing tasks.
- Neural network architectures: Experiment with various neural network designs (e.g., Convolutional Neural Networks for image recognition or Recurrent Neural Networks
for time series analysis). Deep learning libraries like TensorFlow, PyTorch, or Keras can provide a wide range of prebuilt modules to create and train these models.
- Training loop setup: Implement a standard training loop that includes data loading, model initialization, optimization algorithm selection (e.g., Adam), and model
evaluation on the validation set using metrics like accuracy, loss, and AUC. Libraries like TensorFlow, PyTorch, or Keras offer built-in APIs for these tasks.
- Model storage: Store trained models in a format that can be easily retrieved later for meta-learning. The popular formats include HDF5 (using h5py library)
or JSON (with the json module).
Steps to take:
- Data collection and preprocessing:
* Gather datasets for various tasks (e.g., CIFAR-10 for image recognition, IMDB or AG News for NLP, TIDIGITS for speech recognition, or ECG5000 for time series analysis)
* Normalize the data if necessary using libraries like NumPy or scikit-learn
* Augment the data (if needed) to improve model generalization
* Split the dataset into training, validation, and testing sets
- Neural network architectures:
* Choose appropriate models based on the task type: Convolutional Neural Networks for image recognition (e.g., VGG, ResNet), Recurrent Neural Networks for
sequence data processing (e.g., LSTM, GRU), Transformers for NLP tasks (BERT, GPT-2/3), or Feedforward networks for speech and time series analysis
- Training loop setup:
* Initialize the chosen neural network model using a library like TensorFlow, PyTorch, or Keras
* Define a loss function (e.g., cross-entropy for classification tasks) and an optimizer algorithm (Adam, SGD)
* Create a training loop with forward propagation, backpropagation, and weight update steps
* Evaluate the model's performance on validation data after each epoch using metrics like accuracy, loss, and AUC
* Store the trained models in an appropriate format for future use (e.g., HDF5 or JSON)
2. Meta-Learning Phase: Here, we aim to develop a meta-learner that can observe and learn from the base model's performance during its training process to gain insights
into effective neural network designs, their strengths and weaknesses, and the factors influencing efficiency.
- Observe the base model: Track the base model's performance on various tasks at different stages of its training. Collect relevant metrics like accuracy,
loss function values, training time, and resource utilization to provide the meta-learner with a comprehensive understanding of the base model's learning
process and efficiency.
- Develop the meta-learner: Implement machine learning or deep learning algorithms to analyze and learn from the collected data. This learner could use techniques
like reinforcement learning, supervised learning, or unsupervised learning depending on the available data and desired outcomes.
Steps to take:
- Data collection for meta-learning: Collect performance metrics from the base models' training process, including accuracy, loss function values, training time,
and resource utilization. These data can be stored in a separate file or directly appended to the model checkpoint file. Libraries like NumPy and pandas can help
manage this data efficiently.
- Meta-learner design: Choose an appropriate machine learning algorithm (e.g., reinforcement learning with Proximal Policy Optimization, supervised learning with a
regression model, or unsupervised learning with autoencoders) to learn from the meta-data collected during base model training. Libraries like TensorFlow, PyTorch,
  scikit-learn, and OpenAI Gym can provide support for different machine learning algorithms (a minimal sketch follows this list).
- Hyperparameter optimization: Fine-tune hyperparameters for both the base model's training loop and the meta-learner using techniques such as grid search or Bayesian
  optimization. Libraries like scikit-optimize, Optuna, and Hyperopt can help optimize hyperparameters effectively.
- Meta-learning evaluation: Assess the performance of the meta-learner by testing it on new base models trained on different tasks and datasets. Compare its predictions
against ground truth (e.g., optimal architectures for specific problems) to evaluate its learning capabilities accurately.
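To make the data-collection and meta-learner-design steps concrete, here is a minimal sketch: per-run metadata is kept in a pandas DataFrame and a supervised meta-learner (a scikit-learn RandomForestRegressor) learns to predict validation accuracy from simple architecture descriptors. The column names, descriptors, and numbers are illustrative assumptions, not a fixed schema.
```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Meta-data collected while training several base models. Each row describes one
# run: coarse architecture descriptors plus the outcome metrics we care about.
# (Illustrative values; in practice these come from the training loops above.)
runs = pd.DataFrame([
    {"depth": 2, "width": 32,  "params_m": 0.3, "train_minutes": 4.0,  "val_acc": 0.62},
    {"depth": 4, "width": 64,  "params_m": 1.1, "train_minutes": 9.0,  "val_acc": 0.71},
    {"depth": 6, "width": 128, "params_m": 4.2, "train_minutes": 21.0, "val_acc": 0.74},
    {"depth": 8, "width": 128, "params_m": 6.8, "train_minutes": 30.0, "val_acc": 0.73},
    {"depth": 4, "width": 128, "params_m": 2.3, "train_minutes": 12.0, "val_acc": 0.72},
    {"depth": 6, "width": 64,  "params_m": 2.0, "train_minutes": 14.0, "val_acc": 0.70},
])

features = ["depth", "width", "params_m", "train_minutes"]

# Supervised meta-learner: predict expected accuracy from architecture features.
# With real data you would hold out runs (or cross-validate) to evaluate it.
meta_learner = RandomForestRegressor(n_estimators=200, random_state=0)
meta_learner.fit(runs[features], runs["val_acc"])

# The trained meta-learner can now score an unseen candidate architecture
candidate = pd.DataFrame([{"depth": 5, "width": 96, "params_m": 2.5, "train_minutes": 15.0}])
print("predicted accuracy:", meta_learner.predict(candidate)[0])
```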
3. Adaptive Architecture Generation: Based on the insights gained through meta-learning, develop an algorithm that generates customized neural network architectures
tailored to specific tasks or datasets. These architectures should be optimized for both accuracy and efficiency in a manner that dynamically adapts to new information.
- Architecture design space exploration: Generate a diverse set of possible neural network designs using different building blocks (e.g., convolutional layers, pooling
layers, recurrent layers, etc.) and connectivity patterns. These designs could range from simple to complex architectures depending on the problem's complexity and
available computational resources.
- Meta-learning-guided architecture selection: Use the insights gained from meta-learning to evaluate and rank these potential architectures based on factors like
historical performance, resource efficiency, and problem-specific features (e.g., spatial relationships for image tasks or temporal dependencies for time series
analysis).
- Adaptive architecture optimization: Apply genetic algorithms, gradient-based optimization methods, or other search techniques to refine the selected architectures
further in terms of both accuracy and resource utilization.
Steps to take:
- Architecture exploration: Implement a method to generate a diverse set of potential neural network designs based on different building blocks and connectivity patterns.
Libraries like TensorFlow or PyTorch provide useful modules (e.g., layers, optimizers) for constructing these architectures.
- Meta-learner integration: Integrate the meta-learner's insights into the architecture exploration process to rank and select candidate architectures based on their
  potential performance in specific tasks or datasets. This could involve using machine learning models like Random Forests or Support Vector Machines for ranking (see the sketch after this list).
- Architecture optimization: Fine-tune the selected architectures using techniques like gradient descent, genetic algorithms (using libraries such as DEAP), or Bayesian
optimization to improve their accuracy and efficiency.
- Model deployment: Incorporate the optimized neural network architecture into a new AI system that can solve specific tasks or datasets effectively.
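Continuing from the meta-learner sketch above (it reuses the `meta_learner` and `features` names defined there), the following illustrates design-space exploration and meta-learning-guided ranking. The design space, the crude cost proxies, and the scoring weight are arbitrary assumptions chosen only to show the flow.
```python
import random
import pandas as pd

random.seed(0)

def sample_candidate():
    """Draw one candidate architecture from a small, hand-defined design space."""
    depth = random.choice([2, 4, 6, 8])
    width = random.choice([32, 64, 128, 256])
    # Crude stand-ins for parameter count (millions) and expected training time
    params_m = depth * width / 180.0
    train_minutes = depth * width / 30.0
    return {"depth": depth, "width": width,
            "params_m": params_m, "train_minutes": train_minutes}

# Architecture design space exploration: generate a pool of candidate designs
pool = pd.DataFrame([sample_candidate() for _ in range(50)])

# Meta-learning-guided selection: rank candidates by predicted accuracy,
# lightly penalized by predicted training cost (the weight is arbitrary)
pool["pred_acc"] = meta_learner.predict(pool[features])
pool["score"] = pool["pred_acc"] - 0.002 * pool["train_minutes"]
best = pool.sort_values("score", ascending=False).head(5)
print(best[["depth", "width", "pred_acc", "train_minutes", "score"]])
```
The top-ranked candidates would then be refined further (for example with DEAP's genetic algorithms or Bayesian optimization) before deployment.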
4. Continuous Optimization:
Steps to take:
- Monitoring in-situ performance: Implement mechanisms to collect feedback metrics from the deployed AI system's operation in real-time. This could involve integrating
logging and monitoring tools like TensorBoard, Weave, or Prometheus for tracking key metrics such as accuracy, response times, resource utilization, and error rates.
- Feedback processing: Use these real-time feedback metrics to update the meta-learner's understanding of effective architectures for various scenarios. Libraries like
NumPy and pandas can help process this data.
- Dynamic architecture updates: Utilize the updated insights from the meta-learner to periodically reevaluate and possibly modify the deployed neural network
architecture in real-time, improving the AI system's efficiency. This step could involve retraining the base model or applying dynamic optimization techniques
  like pruning, quantization, or knowledge distillation (a pruning sketch follows this list).
- Model retraining: Incorporate feedback from the deployed AI system's performance into the base model's training process to further enhance its understanding of
effective neural network architectures across different tasks and problem domains. This step might involve revisiting the initial training stage with updated data
and improved architecture suggestions.
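As a small example of the dynamic optimization techniques mentioned above, the sketch below applies magnitude pruning to a deployed PyTorch model; the architecture is a placeholder, and quantization or knowledge distillation would slot into the same place in the workflow.
```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Placeholder for the deployed model; in practice this would be the
# architecture produced in step 3, loaded from its stored checkpoint.
model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 10),
)

# Prune 30% of the smallest-magnitude weights in every Linear layer
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

# Report the resulting sparsity as a quick efficiency metric
total = nonzero = 0
for p in model.parameters():
    total += p.numel()
    nonzero += (p != 0).sum().item()
print(f"sparsity after pruning: {1 - nonzero / total:.2%}")
```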
note from limin:
imma keep it 100. I need help with this. i been working on this idea for a while but im not the most skilled. someone please help | [] | [
"TAGS\n#license-other #region-us \n"
] |
2901e37ff5a10825284a7ffd7c2096e895dae31c |
<p align="center"style="font-size:32px;">
<strong>Walking Tours Dataset</strong>
</p>
<p align="center">
<img src="gifs/Wt_img.jpg" alt="Alt Text" width="80%" />
</p>
## Overview
The Walking Tours dataset is a unique collection of long-duration egocentric videos captured in urban environments from cities in Europe and Asia. It consists of 10 high-resolution videos, each showcasing a person walking through a different environment, ranging from city centers to parks to residential areas, under different lighting conditions. A video from a Wildlife safari is also included to diversify the dataset with natural environments. The dataset is completely unlabeled and uncurated, making it suitable for self-supervised pretraining.
## Cities Covered
The dataset encompasses walks through the following cities:
- Amsterdam
- Bangkok
- Chiang Mai
- Istanbul
- Kuala Lumpur
- Singapore
- Stockholm
- Venice
- Zurich
## Video Specifications
- **Resolution:** 4K (3840 × 2160 pixels)
- **Frame Rate:** 60 frames-per-second
- **License:** Creative Commons License (CC-BY)
## Duration
The videos vary in duration, offering a diverse range of content:
- Minimum Duration: 59 minutes (Wildlife safari)
- Maximum Duration: 2 hours 55 minutes (Bangkok)
- Average Duration: 1 hour 38 minutes
## Download the Dataset
The complete list of WTour videos is available in ```WTour.txt```, comprising the YouTube link and the corresponding city for each video.
To download the dataset, we first install **pytube**
```
pip install pytube
```
then, we run
```
python download_WTours.py --output_folder <path_to_folder>
```
In order to comply with [GDPR](https://gdpr.eu/what-is-gdpr/), we also try to blur out all faces and license plates appearing in the videos using [Deface](https://github.com/ORB-HD/deface).
To do this for all videos in the WTour dataset, first install Deface:
```
python3 -m pip install deface
```
Then run Deface on all videos using the bash script:
```
chmod a+x gdpr_blur_faces.sh
./gdpr_blur_faces.sh
```
## Citation
If you find this work useful and use it in your own research, please cite our paper:
```
@inproceedings{venkataramanan2023imagenet,
title={Is ImageNet worth 1 video? Learning strong image encoders from 1 long unlabelled video},
author={Venkataramanan, Shashanka and Rizve, Mamshad Nayeem and Carreira, Jo{\~a}o and Asano, Yuki M and Avrithis, Yannis},
booktitle={International Conference on Learning Representations},
year={2024}
}
```
--- | shawshankvkt/Walking_Tours | [
"task_categories:image-classification",
"task_categories:image-to-video",
"size_categories:n<1K",
"language:en",
"license:cc-by-4.0",
"self-supervised learning",
"representation learning",
"region:us"
] | 2024-01-18T08:49:23+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["image-classification", "image-to-video"], "pretty_name": "Walking_Tours", "tags": ["self-supervised learning", "representation learning"]} | 2024-01-28T17:24:23+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-classification #task_categories-image-to-video #size_categories-n<1K #language-English #license-cc-by-4.0 #self-supervised learning #representation learning #region-us
|
<p align="center"style="font-size:32px;">
<strong>Walking Tours Dataset</strong>
</p>
<p align="center">
<img src="gifs/Wt_img.jpg" alt="Alt Text" width="80%" />
</p>
## Overview
The Walking Tours dataset is a unique collection of long-duration egocentric videos captured in urban environments from cities in Europe and Asia. It consists of 10 high-resolution videos, each showcasing a person walking through a different environment, ranging from city centers to parks to residential areas, under different lighting conditions. A video from a Wildlife safari is also included to diversify the dataset with natural environments. The dataset is completely unlabeled and uncurated, making it suitable for self-supervised pretraining.
## Cities Covered
The dataset encompasses walks through the following cities:
- Amsterdam
- Bangkok
- Chiang Mai
- Istanbul
- Kuala Lumpur
- Singapore
- Stockholm
- Venice
- Zurich
## Video Specifications
- Resolution: 4K (3840 × 2160 pixels)
- Frame Rate: 60 frames-per-second
- License: Creative Commons License (CC-BY)
## Duration
The videos vary in duration, offering a diverse range of content:
- Minimum Duration: 59 minutes (Wildlife safari)
- Maximum Duration: 2 hours 55 minutes (Bangkok)
- Average Duration: 1 hour 38 minutes
## Download the Dataset
The complete list of WTour videos is available in , comprising the YouTube link and the corresponding city.
To download the dataset, we first install pytube
then, we run
In order to comply with GDPR, we also try to blur out all faces and license plates appearing in the video using Deface
To do this for all videos in WTour dataset:
Then run Deface on all videos using the bash script:
If you find this work useful and use it in your own research, please cite our paper:
--- | [
"## Overview\n\nThe Walking Tours dataset is a unique collection of long-duration egocentric videos captured in urban environments from cities in Europe and Asia. It consists of 10 high-resolution videos, each showcasing a person walking through a different environment, ranging from city centers to parks to residential areas, under different lighting conditions. A video from a Wildlife safari is also included to diversify the dataset with natural environments. The dataset is completely unlabeled and uncurated, making it suitable for self-supervised pretraining.",
"## Cities Covered\n\nThe dataset encompasses walks through the following cities:\n\n- Amsterdam\n- Bangkok\n- Chiang Mai\n- Istanbul\n- Kuala Lumpur\n- Singapore\n- Stockholm\n- Venice\n- Zurich",
"## Video Specifications\n\n- Resolution: 4K (3840 × 2160 pixels)\n- Frame Rate: 60 frames-per-second\n- License: Creative Commons License (CC-BY)",
"## Duration\n\nThe videos vary in duration, offering a diverse range of content:\n\n- Minimum Duration: 59 minutes (Wildlife safari)\n- Maximum Duration: 2 hours 55 minutes (Bangkok)\n- Average Duration: 1 hour 38 minutes",
"## Download the Dataset\n\nThe complete list of WTour videos are available in , comprising the YouTube link and the corresponding city. \n\nTo download the dataset, we first install pytube\n\n\nthen, we run \n\n\nIn order to comply with GDPR, we also try to blur out all faces and license plates appearing in the video using Deface \n\nTo do this for all videos in WTour dataset:\n\nThen run Deface on all videos using the bash script:\n\n\n\nIf you find this work useful and use it on your own research, please cite our paper:\n\n\n---"
] | [
"TAGS\n#task_categories-image-classification #task_categories-image-to-video #size_categories-n<1K #language-English #license-cc-by-4.0 #self-supervised learning #representation learning #region-us \n",
"## Overview\n\nThe Walking Tours dataset is a unique collection of long-duration egocentric videos captured in urban environments from cities in Europe and Asia. It consists of 10 high-resolution videos, each showcasing a person walking through a different environment, ranging from city centers to parks to residential areas, under different lighting conditions. A video from a Wildlife safari is also included to diversify the dataset with natural environments. The dataset is completely unlabeled and uncurated, making it suitable for self-supervised pretraining.",
"## Cities Covered\n\nThe dataset encompasses walks through the following cities:\n\n- Amsterdam\n- Bangkok\n- Chiang Mai\n- Istanbul\n- Kuala Lumpur\n- Singapore\n- Stockholm\n- Venice\n- Zurich",
"## Video Specifications\n\n- Resolution: 4K (3840 × 2160 pixels)\n- Frame Rate: 60 frames-per-second\n- License: Creative Commons License (CC-BY)",
"## Duration\n\nThe videos vary in duration, offering a diverse range of content:\n\n- Minimum Duration: 59 minutes (Wildlife safari)\n- Maximum Duration: 2 hours 55 minutes (Bangkok)\n- Average Duration: 1 hour 38 minutes",
"## Download the Dataset\n\nThe complete list of WTour videos are available in , comprising the YouTube link and the corresponding city. \n\nTo download the dataset, we first install pytube\n\n\nthen, we run \n\n\nIn order to comply with GDPR, we also try to blur out all faces and license plates appearing in the video using Deface \n\nTo do this for all videos in WTour dataset:\n\nThen run Deface on all videos using the bash script:\n\n\n\nIf you find this work useful and use it on your own research, please cite our paper:\n\n\n---"
] |
3b3b827c5dad811225ed01523ffe8d6f1184d6a1 |
# Dataset of milady (Fire Emblem)
This is the dataset of milady (Fire Emblem), containing 15 images and their tags.
The core tags of this character are `red_hair, red_eyes, short_hair, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 12.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/milady_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 7.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/milady_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 12.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/milady_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 10.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/milady_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 17.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/milady_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/milady_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, red_armor, solo, circlet, elbow_gloves, jewelry, belt, boots, shoulder_armor, skirt, thighhighs, spear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | red_armor | solo | circlet | elbow_gloves | jewelry | belt | boots | shoulder_armor | skirt | thighhighs | spear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:----------|:---------------|:----------|:-------|:--------|:-----------------|:--------|:-------------|:--------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/milady_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T08:51:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T08:54:13+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of milady (Fire Emblem)
===============================
This is the dataset of milady (Fire Emblem), containing 15 images and their tags.
The core tags of this character are 'red\_hair, red\_eyes, short\_hair, earrings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a46f5eb3d70a6a51b14c7b50991c71ffa46e8ab1 |
# Dataset of elimine (Fire Emblem)
This is the dataset of elimine (Fire Emblem), containing 17 images and their tags.
The core tags of this character are `blonde_hair, breasts, long_hair, green_eyes, very_long_hair, bangs, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 26.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 13.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 26.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 22.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 39.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elimine_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, smile, cape, white_dress, looking_at_viewer, elbow_gloves, holding, white_gloves, closed_mouth, simple_background, armlet, bracelet, long_dress, staff, full_body, open_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | cape | white_dress | looking_at_viewer | elbow_gloves | holding | white_gloves | closed_mouth | simple_background | armlet | bracelet | long_dress | staff | full_body | open_mouth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------|:--------------|:--------------------|:---------------|:----------|:---------------|:---------------|:--------------------|:---------|:-----------|:-------------|:--------|:------------|:-------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/elimine_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T09:06:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T09:09:53+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of elimine (Fire Emblem)
================================
This is the dataset of elimine (Fire Emblem), containing 17 images and their tags.
The core tags of this character are 'blonde\_hair, breasts, long\_hair, green\_eyes, very\_long\_hair, bangs, medium\_breasts, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2d5886ad57fb26e59d5ec4244f14b675e2e47fec |
# Dataset of annand (Fire Emblem)
This is the dataset of annand (Fire Emblem), containing 20 images and their tags.
The core tags of this character are `green_hair, long_hair, green_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 18.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/annand_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 12.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/annand_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 35 | 20.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/annand_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 16.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/annand_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 35 | 26.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/annand_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/annand_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, circlet, smile, breastplate, elbow_gloves, simple_background, thighhighs, white_background, belt, boots, closed_mouth, looking_at_viewer, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | circlet | smile | breastplate | elbow_gloves | simple_background | thighhighs | white_background | belt | boots | closed_mouth | looking_at_viewer | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------|:--------------|:---------------|:--------------------|:-------------|:-------------------|:-------|:--------|:---------------|:--------------------|:--------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/annand_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T09:06:47+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T09:11:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of annand (Fire Emblem)
===============================
This is the dataset of annand (Fire Emblem), containing 20 images and their tags.
The core tags of this character are 'green\_hair, long\_hair, green\_eyes, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
56ef97575cc25a1a5fcb73818e80cb6e1836ad5f |
# Dataset of syrene (Fire Emblem)
This is the dataset of syrene (Fire Emblem), containing 12 images and their tags.
The core tags of this character are `green_eyes, green_hair, long_hair, headband, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 10.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syrene_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syrene_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 27 | 15.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syrene_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 9.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syrene_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 27 | 18.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syrene_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/syrene_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, breastplate, white_gloves, looking_at_viewer, simple_background, thighhighs, white_background, belt, blush, boots, nipples, shoulder_armor, smile, sword |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | breastplate | white_gloves | looking_at_viewer | simple_background | thighhighs | white_background | belt | blush | boots | nipples | shoulder_armor | smile | sword |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:---------------|:--------------------|:--------------------|:-------------|:-------------------|:-------|:--------|:--------|:----------|:-----------------|:--------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/syrene_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T09:08:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T09:10:27+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of syrene (Fire Emblem)
===============================
This is the dataset of syrene (Fire Emblem), containing 12 images and their tags.
The core tags of this character are 'green\_eyes, green\_hair, long\_hair, headband, breasts, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
53075b5275f3c51dad8f8dd3bf1ab41a1d93286a | Dataset Summary
---------------
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise. For more detailed information please refer to the paper.
This dataset is a processed form of the "dair-ai/emotion" dataset [https://huggingface.co/datasets/dair-ai/emotion].
In this one, I have replicated/duplicated the samples for the minority classes so that all the emotion classes have an approximately equal sample count.
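For reference, a minimal sketch of loading this dataset with the Hugging Face `datasets` library; the repository id and label names are taken from this card, and the library itself is an assumption about your environment.
```python
from datasets import load_dataset

# Load the rebalanced emotion dataset from the Hub
ds = load_dataset("manojkumarvohra/replicated_emotions")

print(ds)  # train / validation / test splits
example = ds["train"][0]
label_name = ds["train"].features["labels"].int2str(example["labels"])
print(example["text"], "->", label_name)
```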
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype:
class_label:
names:
'0': sadness
'1': joy
'2': love
'3': anger
'4': fear
'5': surprise
splits:
- name: train
num_bytes: 3160217
num_examples: 28584
- name: validation
num_bytes: 214695
num_examples: 2000
- name: test
num_bytes: 217173
num_examples: 2000
download_size: 1294212
dataset_size: 3592085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
| manojkumarvohra/replicated_emotions | [
"region:us"
] | 2024-01-18T09:29:57+00:00 | {} | 2024-01-18T09:45:05+00:00 | [] | [] | TAGS
#region-us
| Dataset Summary
---------------
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise. For more detailed information please refer to the paper.
This dataset is a processed form of "dair-ai/emotion" dataset. [URL
In this one, I have replicated/duplicated the samples for the minority classes so that all the emotion classes have an approximately equal sample count.
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype:
class_label:
names:
'0': sadness
'1': joy
'2': love
'3': anger
'4': fear
'5': surprise
splits:
- name: train
num_bytes: 3160217
num_examples: 28584
- name: validation
num_bytes: 214695
num_examples: 2000
- name: test
num_bytes: 217173
num_examples: 2000
download_size: 1294212
dataset_size: 3592085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
| [] | [
"TAGS\n#region-us \n"
] |
2ced6f7d3fc6fe5e131d322fed53b5eadb1b448f |
# Dataset of saphy (Fire Emblem)
This is the dataset of saphy (Fire Emblem), containing 17 images and their tags.
The core tags of this character are `green_hair, green_eyes, long_hair, bangs, breasts, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 13.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saphy_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 8.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saphy_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 12.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saphy_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 11.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saphy_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 16.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saphy_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saphy_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, long_sleeves, hood, solo, jewelry, smile, white_background, full_body, holding_staff, looking_at_viewer, simple_background, long_dress, open_mouth, white_cape |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | hood | solo | jewelry | smile | white_background | full_body | holding_staff | looking_at_viewer | simple_background | long_dress | open_mouth | white_cape |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:-------|:----------|:--------|:-------------------|:------------|:----------------|:--------------------|:--------------------|:-------------|:-------------|:-------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/saphy_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T09:30:31+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T09:33:45+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of saphy (Fire Emblem)
==============================
This is the dataset of saphy (Fire Emblem), containing 17 images and their tags.
The core tags of this character are 'green\_hair, green\_eyes, long\_hair, bangs, breasts, hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2cdb92b3e2dcb5d1da1137a9ba32529c7db97937 |
## Sports Image Classification dataset
From Kaggle: [100 Sports Image Classification](https://www.kaggle.com/datasets/ponrajsubramaniian/sportclassificationdataset)
Collection of sports images covering 100 different sports. Images are 224x224x3 jpg format.
Data is separated into train, test and valid directories.
* 13493 train images
* 500 test images
* 500 validate images
Additionally, a CSV file is included for those who wish to use it to create their own train, test and validation datasets.
### Clone
```
git clone https://huggingface.co/datasets/HES-XPLAIN/SportsImageClassification
```
or
```
git clone [email protected]:datasets/HES-XPLAIN/SportsImageClassification
```
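### Load the data

After cloning, here is a minimal sketch of reading the split directories with torchvision's `ImageFolder`; torchvision and the one-sub-folder-per-sport layout are assumptions here (matching the original Kaggle release), and any class-per-folder image loader works the same way.
```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([transforms.ToTensor()])  # images are already 224x224

# Assumes one sub-folder per sport inside each split directory
train_set = datasets.ImageFolder("SportsImageClassification/train", transform=transform)
valid_set = datasets.ImageFolder("SportsImageClassification/valid", transform=transform)

train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
print(len(train_set), "training images,", len(train_set.classes), "sports classes")
```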
### Modify dataset
To add data, make sure Git LFS is installed.
```
git lfs install
```
Then proceed accordingly with `git add` and `git push`.
| HES-XPLAIN/SportsImageClassification | [
"license:cc0-1.0",
"region:us"
] | 2024-01-18T09:47:32+00:00 | {"license": "cc0-1.0"} | 2024-01-18T11:52:21+00:00 | [] | [] | TAGS
#license-cc0-1.0 #region-us
|
## Sports Image Classification dataset
From Kaggle: 100 Sports Image Classification
Collection of sports images covering 100 different sports. Images are 224x224x3 jpg format.
Data is separated into train, test and valid directories.
* 13493 train images
* 500 test images
* 500 validate images
Additionally, a CSV file is included for those who wish to use it to create their own train, test and validation datasets.
### Clone
or
### Modify dataset
To add data, ensure to install LFS.
Then proceed accordingly with 'git add' and 'git push'.
| [
"## Sports Image Classification dataset\n\nFrom Kaggle: 100 Sports Image Classification\n\nCollection of sports images covering 100 different sports. Images are 224x224x3 jpg format.\n\nData is separated into train, test and valid directories.\n\n* 13493 train images\n* 500 test images\n* 500 validate images\n\nAdditionallly a csv file is included for those that wish to use it to create there own train, test and validation datasets.",
"### Clone\n\n\n\nor",
"### Modify dataset\n\nTo add data, ensure to install LFS.\n\n\n\nThen proceed accordingly with 'git add' and 'git push'."
] | [
"TAGS\n#license-cc0-1.0 #region-us \n",
"## Sports Image Classification dataset\n\nFrom Kaggle: 100 Sports Image Classification\n\nCollection of sports images covering 100 different sports. Images are 224x224x3 jpg format.\n\nData is separated into train, test and valid directories.\n\n* 13493 train images\n* 500 test images\n* 500 validate images\n\nAdditionallly a csv file is included for those that wish to use it to create there own train, test and validation datasets.",
"### Clone\n\n\n\nor",
"### Modify dataset\n\nTo add data, ensure to install LFS.\n\n\n\nThen proceed accordingly with 'git add' and 'git push'."
] |
aa87ec0dca77a05c9531692062655c294fa47bd3 | # Dataset Card for "vto_dress_train_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ares1123/vto_dress_train_data | [
"region:us"
] | 2024-01-18T10:01:27+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "conditioning_image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3926205095.0, "num_examples": 11647}], "download_size": 3925251404, "dataset_size": 3926205095.0}} | 2024-01-18T13:31:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vto_dress_train_data"
More Information needed | [
"# Dataset Card for \"vto_dress_train_data\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vto_dress_train_data\"\n\nMore Information needed"
] |
288f39cfbbec0feacc1527d229e562933a76fadf |
# Dataset of dezel (Fire Emblem)
This is the dataset of dezel (Fire Emblem), containing 10 images and their tags.
The core tags of this character are `short_hair, black_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 10.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dezel_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dezel_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 19 | 12.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dezel_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 9.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dezel_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 19 | 14.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dezel_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dezel_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, gauntlets, polearm, armored_boots, breastplate, holding_weapon, looking_at_viewer, shield |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | gauntlets | polearm | armored_boots | breastplate | holding_weapon | looking_at_viewer | shield |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:----------|:----------------|:--------------|:-----------------|:--------------------|:---------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/dezel_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-18T10:06:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-18T10:12:04+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of dezel (Fire Emblem)
==============================
This is the dataset of dezel (Fire Emblem), containing 10 images and their tags.
The core tags of this character are 'short\_hair, black\_hair, purple\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
aead1c27c2ef21558ce56824b02512df99ad654d | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | georgefraiha/momentai | [
"task_categories:table-question-answering",
"size_categories:10B<n<100B",
"language:en",
"language:es",
"language:fr",
"language:it",
"region:us"
] | 2024-01-18T10:50:53+00:00 | {"language": ["en", "es", "fr", "it"], "size_categories": ["10B<n<100B"], "task_categories": ["table-question-answering"]} | 2024-01-18T11:04:14+00:00 | [] | [
"en",
"es",
"fr",
"it"
] | TAGS
#task_categories-table-question-answering #size_categories-10B<n<100B #language-English #language-Spanish #language-French #language-Italian #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-table-question-answering #size_categories-10B<n<100B #language-English #language-Spanish #language-French #language-Italian #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
871ca0ab8a8f62d82059d41de32e7ca72dc17b8e |
# Dataset Card for Evaluation run of wang7776/vicuna-7b-v1.3-sparsity-10
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/vicuna-7b-v1.3-sparsity-10](https://huggingface.co/wang7776/vicuna-7b-v1.3-sparsity-10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10",
"harness_winogrande_5",
split="train")
```
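The same pattern works for any of the other task configurations. A minimal sketch (assuming the `datasets` library is installed, and using the configuration and split names listed in this card's metadata, e.g. `harness_gsm8k_5` and the `latest` split) is:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each task configuration exposes timestamped splits plus a "latest" split
# pointing to the most recent run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```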
## Latest results
These are the [latest results from run 2024-01-18T11:22:25.072953](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10/blob/main/results_2024-01-18T11-22-25.072953.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47881736198022445,
"acc_stderr": 0.03436036561115092,
"acc_norm": 0.48497893053921287,
"acc_norm_stderr": 0.03513680059331847,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.01622075676952092,
"mc2": 0.46883380227445837,
"mc2_stderr": 0.015097398934278281
},
"harness|arc:challenge|25": {
"acc": 0.47013651877133106,
"acc_stderr": 0.0145853058400071,
"acc_norm": 0.514505119453925,
"acc_norm_stderr": 0.014605241081370053
},
"harness|hellaswag|10": {
"acc": 0.5793666600278828,
"acc_stderr": 0.004926518439372264,
"acc_norm": 0.7697669786895041,
"acc_norm_stderr": 0.004201215520808244
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.03067609659938918,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.03067609659938918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.031410821975962386,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.031410821975962386
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.03340361906276585,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.03340361906276585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46923076923076923,
"acc_stderr": 0.02530295889085015,
"acc_norm": 0.46923076923076923,
"acc_norm_stderr": 0.02530295889085015
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.636697247706422,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.636697247706422,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.033933885849584046,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.033933885849584046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6497890295358649,
"acc_stderr": 0.03105239193758435,
"acc_norm": 0.6497890295358649,
"acc_norm_stderr": 0.03105239193758435
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836185,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836185
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787275,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787275
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.030118210106942645,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.030118210106942645
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.016967031766413624,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.016967031766413624
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088013,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088013
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489424,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489424
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607704,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3617992177314211,
"acc_stderr": 0.012272736233262941,
"acc_norm": 0.3617992177314211,
"acc_norm_stderr": 0.012272736233262941
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.030254372573976687,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.030254372573976687
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.020017629214213097,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.020017629214213097
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.036602988340491624,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.036602988340491624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.01622075676952092,
"mc2": 0.46883380227445837,
"mc2_stderr": 0.015097398934278281
},
"harness|winogrande|5": {
"acc": 0.6977111286503551,
"acc_stderr": 0.012907200361627538
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.009298499235587877
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10 | [
"region:us"
] | 2024-01-18T11:24:16+00:00 | {"pretty_name": "Evaluation run of wang7776/vicuna-7b-v1.3-sparsity-10", "dataset_summary": "Dataset automatically created during the evaluation run of model [wang7776/vicuna-7b-v1.3-sparsity-10](https://huggingface.co/wang7776/vicuna-7b-v1.3-sparsity-10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T11:22:25.072953](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10/blob/main/results_2024-01-18T11-22-25.072953.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47881736198022445,\n \"acc_stderr\": 0.03436036561115092,\n \"acc_norm\": 0.48497893053921287,\n \"acc_norm_stderr\": 0.03513680059331847,\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.01622075676952092,\n \"mc2\": 0.46883380227445837,\n \"mc2_stderr\": 0.015097398934278281\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.0145853058400071,\n \"acc_norm\": 0.514505119453925,\n \"acc_norm_stderr\": 0.014605241081370053\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5793666600278828,\n \"acc_stderr\": 0.004926518439372264,\n \"acc_norm\": 0.7697669786895041,\n \"acc_norm_stderr\": 0.004201215520808244\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.03067609659938918,\n \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.03067609659938918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.04179596617581\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.031410821975962386,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.031410821975962386\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.024130158299762613,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.024130158299762613\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.03340361906276585,\n \"acc_norm\": 0.689119170984456,\n 
\"acc_norm_stderr\": 0.03340361906276585\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.46923076923076923,\n \"acc_stderr\": 0.02530295889085015,\n \"acc_norm\": 0.46923076923076923,\n \"acc_norm_stderr\": 0.02530295889085015\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.636697247706422,\n \"acc_stderr\": 0.020620603919625804,\n \"acc_norm\": 0.636697247706422,\n \"acc_norm_stderr\": 0.020620603919625804\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.033933885849584046,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.033933885849584046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6497890295358649,\n \"acc_stderr\": 0.03105239193758435,\n \"acc_norm\": 0.6497890295358649,\n \"acc_norm_stderr\": 0.03105239193758435\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836185,\n \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836185\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787275,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787275\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n \"acc_stderr\": 0.030118210106942645,\n \"acc_norm\": 0.6965811965811965,\n \"acc_norm_stderr\": 0.030118210106942645\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n \"acc_stderr\": 0.016967031766413624,\n \"acc_norm\": 0.6577266922094508,\n \"acc_norm_stderr\": 0.016967031766413624\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382868,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382868\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088013,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088013\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n \"acc_stderr\": 0.02827435985489424,\n \"acc_norm\": 0.5466237942122186,\n \"acc_norm_stderr\": 0.02827435985489424\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607704,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.02840662780959095,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.02840662780959095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3617992177314211,\n \"acc_stderr\": 0.012272736233262941,\n \"acc_norm\": 0.3617992177314211,\n \"acc_norm_stderr\": 0.012272736233262941\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.030254372573976687,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.030254372573976687\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.020017629214213097,\n \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.020017629214213097\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.036602988340491624,\n \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.036602988340491624\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.01622075676952092,\n \"mc2\": 0.46883380227445837,\n \"mc2_stderr\": 0.015097398934278281\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627538\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.13115996967399546,\n \"acc_stderr\": 0.009298499235587877\n }\n}\n```", "repo_url": "https://huggingface.co/wang7776/vicuna-7b-v1.3-sparsity-10", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|arc:challenge|25_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|gsm8k|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hellaswag|10_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-22-25.072953.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-22-25.072953.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-22-25.072953.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T11-22-25.072953.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-22-25.072953.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["**/details_harness|winogrande|5_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T11-22-25.072953.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T11_22_25.072953", "path": ["results_2024-01-18T11-22-25.072953.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T11-22-25.072953.parquet"]}]}]} | 2024-01-18T11:24:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of wang7776/vicuna-7b-v1.3-sparsity-10
Dataset automatically created during the evaluation run of model wang7776/vicuna-7b-v1.3-sparsity-10 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
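A minimal sketch of such a load call, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (the exact repository id for this model is an assumption here):

```python
from datasets import load_dataset

# Assumed repository id, inferred from the evaluated model name.
data = load_dataset(
    "open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest results
)
```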
## Latest results
These are the latest results from run 2024-01-18T11:22:25.072953 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
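As a sketch under the same naming assumption as above, the aggregated numbers live in the "results" configuration, whose "latest" split always points to the most recent run:

```python
from datasets import load_dataset

# Assumed repository id; "results" holds the aggregated metrics for the run.
results = load_dataset(
    "open-llm-leaderboard/details_wang7776__vicuna-7b-v1.3-sparsity-10",
    "results",
    split="latest",
)
print(results[0])  # inspect the stored aggregate record
```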
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of wang7776/vicuna-7b-v1.3-sparsity-10\n\n\n\nDataset automatically created during the evaluation run of model wang7776/vicuna-7b-v1.3-sparsity-10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T11:22:25.072953(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wang7776/vicuna-7b-v1.3-sparsity-10\n\n\n\nDataset automatically created during the evaluation run of model wang7776/vicuna-7b-v1.3-sparsity-10 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T11:22:25.072953(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e4fad21814d8818d0dedb16c4894442930967367 |
Source: https://universe.roboflow.com/centro-oncolgico-integral/ecg_labeled_marzo | brainer/ecg_labeled_Marzo | [
"region:us"
] | 2024-01-18T11:35:31+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "width", "dtype": "int64"}, {"name": "height", "dtype": "int64"}, {"name": "pixel_values", "sequence": {"sequence": {"sequence": "uint8"}}}, {"name": "image_id", "dtype": "int64"}, {"name": "image_path", "dtype": "string"}, {"name": "objects", "struct": [{"name": "area", "sequence": "int64"}, {"name": "bbox", "sequence": {"sequence": "int64"}}, {"name": "category", "sequence": "int64"}, {"name": "id", "sequence": "int64"}]}], "splits": [{"name": "train", "num_bytes": 583394854.0, "num_examples": 199}, {"name": "test", "num_bytes": 82413124.0, "num_examples": 28}, {"name": "valid", "num_bytes": 167349293.0, "num_examples": 57}], "download_size": 166870694, "dataset_size": 833157271.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}]} | 2024-01-19T12:43:33+00:00 | [] | [] | TAGS
#region-us
|
Source: URL | [] | [
"TAGS\n#region-us \n"
] |
2053a4df1da3642397cf58fa24c91989ea9c14ce |
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T11:52:33.965270](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1/blob/main/results_2024-01-18T11-52-33.965270.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6063464168939487,
"acc_stderr": 0.03319788955413147,
"acc_norm": 0.6107282081056654,
"acc_norm_stderr": 0.033872604106245235,
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961567,
"mc2": 0.6766170541145112,
"mc2_stderr": 0.015233309088000017
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.01440982551840308,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192601
},
"harness|hellaswag|10": {
"acc": 0.6656044612626967,
"acc_stderr": 0.004708145393411386,
"acc_norm": 0.8490340569607648,
"acc_norm_stderr": 0.0035728399695219857
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849723,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849723
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315525,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03053289223393202,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03053289223393202
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.039800662464677665,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.039800662464677665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139963,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30726256983240224,
"acc_stderr": 0.01543015884646961,
"acc_norm": 0.30726256983240224,
"acc_norm_stderr": 0.01543015884646961
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616302,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616302
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.01959402113657744,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.01959402113657744
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961567,
"mc2": 0.6766170541145112,
"mc2_stderr": 0.015233309088000017
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205198
},
"harness|gsm8k|5": {
"acc": 0.40106141015921154,
"acc_stderr": 0.013500158922245535
}
}
```
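For reference, a small sketch of reading these aggregate numbers back programmatically; the repository id and the "results"/"latest" convention come from the description above, while the exact record layout is an assumption:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1",
    "results",
    split="latest",
)
print(results[0])  # one aggregate record per evaluation run (assumed layout)
```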
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1 | [
"region:us"
] | 2024-01-18T11:54:50+00:00 | {"pretty_name": "Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T11:52:33.965270](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1/blob/main/results_2024-01-18T11-52-33.965270.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6063464168939487,\n \"acc_stderr\": 0.03319788955413147,\n \"acc_norm\": 0.6107282081056654,\n \"acc_norm_stderr\": 0.033872604106245235,\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6766170541145112,\n \"mc2_stderr\": 0.015233309088000017\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192601\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6656044612626967,\n \"acc_stderr\": 0.004708145393411386,\n \"acc_norm\": 0.8490340569607648,\n \"acc_norm_stderr\": 0.0035728399695219857\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n 
\"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315525,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315525\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03053289223393202,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03053289223393202\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n 
\"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885117,\n \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885117\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303531,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303531\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.039800662464677665,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.039800662464677665\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 
0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.014711684386139963,\n \"acc_norm\": 0.7841634738186463,\n \"acc_norm_stderr\": 0.014711684386139963\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30726256983240224,\n \"acc_stderr\": 0.01543015884646961,\n \"acc_norm\": 0.30726256983240224,\n \"acc_norm_stderr\": 0.01543015884646961\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n \"acc_stderr\": 0.012661233805616302,\n \"acc_norm\": 0.4348109517601043,\n \"acc_norm_stderr\": 0.012661233805616302\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6766170541145112,\n \"mc2_stderr\": 0.015233309088000017\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205198\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.40106141015921154,\n \"acc_stderr\": 0.013500158922245535\n }\n}\n```", "repo_url": "https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|arc:challenge|25_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|gsm8k|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hellaswag|10_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-52-33.965270.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-52-33.965270.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-52-33.965270.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T11-52-33.965270.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T11-52-33.965270.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["**/details_harness|winogrande|5_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T11-52-33.965270.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T11_52_33.965270", "path": ["results_2024-01-18T11-52-33.965270.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T11-52-33.965270.parquet"]}]}]} | 2024-01-18T11:55:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1
Dataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
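A minimal sketch (the configuration name is one of those listed in this card's metadata; the repository id is assumed to follow the leaderboard's usual `details_<org>__<model>` naming):
```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20-v0.1",
    "harness_winogrande_5",  # any config name from this card's metadata works
    split="train",
)
```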
## Latest results
These are the latest results from run 2024-01-18T11:52:33.965270 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1\n\n\n\nDataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T11:52:33.965270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1\n\n\n\nDataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-20-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T11:52:33.965270(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
218446e485d0dfe8b3971df0382f3d35bc5f7178 | # Dataset Card for "mmCodeQL89"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | euisuh15/mmCodeQL89 | [
"region:us"
] | 2024-01-18T11:59:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "codes", "dtype": "string"}, {"name": "cwe", "dtype": "string"}, {"name": "codeql", "dtype": "string"}, {"name": "codeql_cwes", "dtype": "string"}, {"name": "codeql_correct", "dtype": "bool"}, {"name": "poison_code", "dtype": "string"}, {"name": "is_poison", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 2486992, "num_examples": 749}, {"name": "test", "num_bytes": 647266, "num_examples": 188}], "download_size": 625385, "dataset_size": 3134258}} | 2024-01-18T11:59:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mmCodeQL89"
More Information needed | [
"# Dataset Card for \"mmCodeQL89\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mmCodeQL89\"\n\nMore Information needed"
] |
5a87d043213bf96203d57a64822fe5418559f23b | ### About dataset
It is a dataset of Ukrainian audiobooks.
Each sample contains approximately 8 seconds of Ukrainian speech.
### Loading script
```python
>>> from datasets import load_dataset
>>> load_dataset("Zarakun/audiobooks_ua_test")
```
### Dataset structure
Every example has the following fields (see the short access sketch after this list):
**audio** - the waveform
**rate** - the sampling rate of the waveform
**file_id** - the id of the speaker
**duration** - the duration of the audio clip in seconds
**sentence** - the transcript of the audio clip
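A short sketch of reading these fields once the dataset is loaded (column names taken from the list above; exact types may differ depending on how the parquet file was written):
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("Zarakun/audiobooks_ua_test", split="train")
>>> sample = ds[0]
>>> sample["sentence"]                                # transcript of the clip
>>> sample["duration"]                                # roughly 8 seconds per sample
>>> waveform, rate = sample["audio"], sample["rate"]  # raw waveform and its sampling rate
```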
| Zarakun/audiobooks_ua_test | [
"audio",
"region:us"
] | 2024-01-18T12:06:15+00:00 | {"tags": ["audio"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train.parquet"}]}]} | 2024-01-18T16:15:59+00:00 | [] | [] | TAGS
#audio #region-us
| ### About dataset
It is a dataset of Ukrainian audiobooks.
Each sample contains approximately 8 seconds of Ukrainian speech.
### Loading script
### Dataset structure
Every example has the following:
audio - the waveform
rate - the sampling rate of the waveform
file_id - the id of the speaker
duration - the duration of the audio clip in seconds
sentence - the transcript of the audio clip
| [
"### About dataset\nIt is a dataset of ukrainian audiobooks \nEach sample contain an approximately 8 seconds od ukrainian speech",
"### Loading script",
"### Dataset structure\nEvery example has the following: \naudio - the waveform \nrate - the sampling rate of the waveform \nfile_id - the id of the speaker \nduration - the duration of the video in seconds \nsentence - the transcript of the video"
] | [
"TAGS\n#audio #region-us \n",
"### About dataset\nIt is a dataset of ukrainian audiobooks \nEach sample contain an approximately 8 seconds od ukrainian speech",
"### Loading script",
"### Dataset structure\nEvery example has the following: \naudio - the waveform \nrate - the sampling rate of the waveform \nfile_id - the id of the speaker \nduration - the duration of the video in seconds \nsentence - the transcript of the video"
] |
6e46cbbca35dcc3a8395e7fb728fab8fb3ecc224 |
# Dataset Card for Evaluation run of kodonho/Momo-70b-DPO-mixed
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/Momo-70b-DPO-mixed](https://huggingface.co/kodonho/Momo-70b-DPO-mixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T12:09:45.590059](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed/blob/main/results_2024-01-18T12-09-45.590059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2318376299932094,
"acc_stderr": 0.02998135076289475,
"acc_norm": 0.23146455622350848,
"acc_norm_stderr": 0.03077051926756787,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.48846362378663954,
"mc2_stderr": 0.016303812688575184
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202609,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.2577175861382195,
"acc_stderr": 0.004364838000335622,
"acc_norm": 0.24975104560844452,
"acc_norm_stderr": 0.004319842107724391
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.0363338441407346,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.0363338441407346
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.02575755989310675,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.02575755989310675
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.19743589743589743,
"acc_stderr": 0.02018264696867484,
"acc_norm": 0.19743589743589743,
"acc_norm_stderr": 0.02018264696867484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910874,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910874
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0413311944024384,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0413311944024384
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.02999695185834949,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.02999695185834949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.01464817274959353,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.01464817274959353
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.48846362378663954,
"mc2_stderr": 0.016303812688575184
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612981
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
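To work with just these aggregated numbers rather than the per-task details, one can load the `results` configuration (a sketch; the `latest` split is the alias for the newest run mentioned above):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points to the newest run
results = load_dataset(
    "open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed",
    "results",
    split="latest",
)
print(results[0])
```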
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed | [
"region:us"
] | 2024-01-18T12:11:53+00:00 | {"pretty_name": "Evaluation run of kodonho/Momo-70b-DPO-mixed", "dataset_summary": "Dataset automatically created during the evaluation run of model [kodonho/Momo-70b-DPO-mixed](https://huggingface.co/kodonho/Momo-70b-DPO-mixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T12:09:45.590059](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed/blob/main/results_2024-01-18T12-09-45.590059.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2318376299932094,\n \"acc_stderr\": 0.02998135076289475,\n \"acc_norm\": 0.23146455622350848,\n \"acc_norm_stderr\": 0.03077051926756787,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.48846362378663954,\n \"mc2_stderr\": 0.016303812688575184\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202609,\n \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351335\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2577175861382195,\n \"acc_stderr\": 0.004364838000335622,\n \"acc_norm\": 0.24975104560844452,\n \"acc_norm_stderr\": 0.004319842107724391\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.0363338441407346,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.0363338441407346\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310675,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310675\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.19743589743589743,\n \"acc_stderr\": 0.02018264696867484,\n \"acc_norm\": 0.19743589743589743,\n \"acc_norm_stderr\": 0.02018264696867484\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n \"acc_stderr\": 0.028930413120910874,\n \"acc_norm\": 0.24663677130044842,\n \"acc_norm_stderr\": 0.028930413120910874\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0413311944024384,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0413311944024384\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n \"acc_stderr\": 0.02999695185834949,\n \"acc_norm\": 0.29914529914529914,\n \"acc_norm_stderr\": 0.02999695185834949\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n \"acc_stderr\": 
0.01464817274959353,\n \"acc_norm\": 0.21328224776500637,\n \"acc_norm_stderr\": 0.01464817274959353\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.48846362378663954,\n \"mc2_stderr\": 0.016303812688575184\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.014044390401612981\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/kodonho/Momo-70b-DPO-mixed", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T12_09_45.590059", "path": ["**/details_harness|winogrande|5_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T12-09-45.590059.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T12_09_45.590059", "path": ["results_2024-01-18T12-09-45.590059.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T12-09-45.590059.parquet"]}]}]} | 2024-01-18T12:12:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kodonho/Momo-70b-DPO-mixed
Dataset automatically created during the evaluation run of model kodonho/Momo-70b-DPO-mixed on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
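A minimal loading sketch (the repository and configuration names below are assumptions, following the naming pattern these leaderboard detail datasets use):

```python
from datasets import load_dataset

# Minimal sketch; the repository and configuration names are assumed from the
# usual Open LLM Leaderboard naming pattern for this model's detail dataset.
data = load_dataset(
    "open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed",
    "harness_winogrande_5",
    split="train",
)
```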
## Latest results
These are the latest results from run 2024-01-18T12:09:45.590059 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kodonho/Momo-70b-DPO-mixed\n\n\n\nDataset automatically created during the evaluation run of model kodonho/Momo-70b-DPO-mixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:09:45.590059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kodonho/Momo-70b-DPO-mixed\n\n\n\nDataset automatically created during the evaluation run of model kodonho/Momo-70b-DPO-mixed on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:09:45.590059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f6baf247c3852b4bbf41acc305a5e5398900ef09 |
# Dataset Card for Evaluation run of chargoddard/internlm2-7b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/internlm2-7b-llama](https://huggingface.co/chargoddard/internlm2-7b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-7b-llama",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T12:13:04.932540](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-7b-llama/blob/main/results_2024-01-18T12-13-04.932540.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6329686690164976,
"acc_stderr": 0.03232315628356604,
"acc_norm": 0.634123920290497,
"acc_norm_stderr": 0.032983274678832275,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5425133114047686,
"mc2_stderr": 0.015593910488675748
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137994,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938169
},
"harness|hellaswag|10": {
"acc": 0.6154152559251145,
"acc_stderr": 0.004855027248398157,
"acc_norm": 0.8098984266082454,
"acc_norm_stderr": 0.003915792315457799
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800897,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800897
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462457,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462457
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176095,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176095
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342856,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342856
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.0165952597103993,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.0165952597103993
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508762,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508762
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.047184714852195865,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.047184714852195865
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217573,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542613,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542613
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574516,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574516
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125468,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125468
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017492,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5425133114047686,
"mc2_stderr": 0.015593910488675748
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.011268519971577684
},
"harness|gsm8k|5": {
"acc": 0.6285064442759667,
"acc_stderr": 0.013309839075706488
}
}
```
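The aggregated scores above are also stored as a standalone JSON file in this repository, named after the run timestamp (the same file the "Latest results" link points to). Below is a minimal sketch for fetching it with the `huggingface_hub` library; note that the exact key layout inside the file may place the per-task scores at the top level or under a `results` key, so the access is kept defensive.

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file for this run from the dataset repository.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_chargoddard__internlm2-7b-llama",
    repo_type="dataset",
    filename="results_2024-01-18T12-13-04.932540.json",
)

with open(results_path) as f:
    run_results = json.load(f)

# The aggregate metrics may sit at the top level or under a "results" key.
scores = run_results.get("results", run_results)
print(scores["all"])
```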
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chargoddard__internlm2-7b-llama | [
"region:us"
] | 2024-01-18T12:15:09+00:00 | {"pretty_name": "Evaluation run of chargoddard/internlm2-7b-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/internlm2-7b-llama](https://huggingface.co/chargoddard/internlm2-7b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-7b-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T12:13:04.932540](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-7b-llama/blob/main/results_2024-01-18T12-13-04.932540.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6329686690164976,\n \"acc_stderr\": 0.03232315628356604,\n \"acc_norm\": 0.634123920290497,\n \"acc_norm_stderr\": 0.032983274678832275,\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5425133114047686,\n \"mc2_stderr\": 0.015593910488675748\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137994,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938169\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6154152559251145,\n \"acc_stderr\": 0.004855027248398157,\n \"acc_norm\": 0.8098984266082454,\n \"acc_norm_stderr\": 0.003915792315457799\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800897,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800897\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462457,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462457\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 
0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176095,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176095\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342856,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342856\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.0165952597103993,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.0165952597103993\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508762,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508762\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8020434227330779,\n \"acc_stderr\": 0.014248873549217573,\n \"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217573\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542613,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542613\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537375,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537375\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574516,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574516\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125468,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125468\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017492,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5425133114047686,\n \"mc2_stderr\": 0.015593910488675748\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.011268519971577684\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6285064442759667,\n \"acc_stderr\": 
0.013309839075706488\n }\n}\n```", "repo_url": "https://huggingface.co/chargoddard/internlm2-7b-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-13-04.932540.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-13-04.932540.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-13-04.932540.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-13-04.932540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-13-04.932540.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T12_13_04.932540", "path": ["**/details_harness|winogrande|5_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T12-13-04.932540.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T12_13_04.932540", "path": ["results_2024-01-18T12-13-04.932540.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T12-13-04.932540.parquet"]}]}]} | 2024-01-18T12:15:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/internlm2-7b-llama
Dataset automatically created during the evaluation run of model chargoddard/internlm2-7b-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
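For example, with the `datasets` library, the 5-shot Winogrande details can be loaded as follows (any of the 63 task configurations can be loaded the same way):

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run;
# the "train"/"latest" split points to the most recent evaluation.
data = load_dataset(
    "open-llm-leaderboard/details_chargoddard__internlm2-7b-llama",
    "harness_winogrande_5",
    split="train",
)
```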
## Latest results
These are the latest results from run 2024-01-18T12:13:04.932540 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chargoddard/internlm2-7b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-7b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:13:04.932540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/internlm2-7b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-7b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:13:04.932540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cbdbff8e1247b4b1e492e15fa32bb6b55e8b8d00 | This dataset is a cleaned and downloaded version of the following dataset: https://huggingface.co/datasets/visheratin/laion-coco-nllb
The main purpose was to extract Turkish captions. | umarigan/turkish_clip_dataset_with_text_embeddings | [
"region:us"
] | 2024-01-18T12:26:38+00:00 | {} | 2024-01-19T15:22:15+00:00 | [] | [] | TAGS
#region-us
| This dataset is a cleaned and downloaded version of the following dataset: URL
The main purpose was to extract Turkish captions. | [] | [
"TAGS\n#region-us \n"
] |
2f9385f72e86d997443040f29f0005d328e947e4 |
# Dataset Card for Evaluation run of liminerity/Blurstral-7b-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blurstral-7b-slerp](https://huggingface.co/liminerity/Blurstral-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp",
"harness_winogrande_5",
split="train")
```
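
The aggregated run-level metrics can be loaded in the same way by pointing at the "results" configuration. The snippet below is a minimal sketch; the "results" configuration name and the "latest" split are taken from this dataset's configuration metadata, while the exact column layout of that split is not documented here, so inspect it before relying on specific field names.

```python
from datasets import load_dataset

# Load the aggregated results of the run instead of the per-sample details.
# The "results" configuration and its "latest" split are listed in the
# dataset configuration metadata of this card.
results = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp",
    "results",
    split="latest",
)
print(results)  # inspect the available columns before relying on any field names
```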
## Latest results
These are the [latest results from run 2024-01-18T12:26:10.788349](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp/blob/main/results_2024-01-18T12-26-10.788349.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.653631806299437,
"acc_stderr": 0.03191715400715241,
"acc_norm": 0.6553861585583183,
"acc_norm_stderr": 0.03256137277170426,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5340041664977683,
"mc2_stderr": 0.014980934499947232
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902272
},
"harness|hellaswag|10": {
"acc": 0.6571400119498108,
"acc_stderr": 0.00473695081061779,
"acc_norm": 0.8538139812786297,
"acc_norm_stderr": 0.0035257057733534183
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406796,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461773,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461773
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678495,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678495
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5340041664977683,
"mc2_stderr": 0.014980934499947232
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.010941877955676207
},
"harness|gsm8k|5": {
"acc": 0.6285064442759667,
"acc_stderr": 0.013309839075706487
}
}
```
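
As a worked example, the per-task scores printed above can be aggregated directly. The sketch below assumes the dictionary has been loaded into a Python variable named `results` (only two entries are reproduced here; the remaining "harness|hendrycksTest-*" entries are elided) and computes the mean 5-shot MMLU accuracy across the hendrycksTest tasks.

```python
# Minimal sketch: average the "acc" field over all MMLU (hendrycksTest) tasks.
# `results` stands for the dictionary printed above; only two entries are
# reproduced here and the rest are elided for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    # ... remaining hendrycksTest entries go here ...
}

mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```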
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp | [
"region:us"
] | 2024-01-18T12:28:29+00:00 | {"pretty_name": "Evaluation run of liminerity/Blurstral-7b-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/Blurstral-7b-slerp](https://huggingface.co/liminerity/Blurstral-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T12:26:10.788349](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp/blob/main/results_2024-01-18T12-26-10.788349.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653631806299437,\n \"acc_stderr\": 0.03191715400715241,\n \"acc_norm\": 0.6553861585583183,\n \"acc_norm_stderr\": 0.03256137277170426,\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5340041664977683,\n \"mc2_stderr\": 0.014980934499947232\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902272\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6571400119498108,\n \"acc_stderr\": 0.00473695081061779,\n \"acc_norm\": 0.8538139812786297,\n \"acc_norm_stderr\": 0.0035257057733534183\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406796,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406796\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461773,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461773\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n \"acc_stderr\": 0.016232826818678495,\n \"acc_norm\": 0.37988826815642457,\n \"acc_norm_stderr\": 0.016232826818678495\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5340041664977683,\n \"mc2_stderr\": 0.014980934499947232\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6285064442759667,\n \"acc_stderr\": 0.013309839075706487\n 
}\n}\n```", "repo_url": "https://huggingface.co/liminerity/Blurstral-7b-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-26-10.788349.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-26-10.788349.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-26-10.788349.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-26-10.788349.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-26-10.788349.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T12_26_10.788349", "path": ["**/details_harness|winogrande|5_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T12-26-10.788349.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T12_26_10.788349", "path": ["results_2024-01-18T12-26-10.788349.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T12-26-10.788349.parquet"]}]}]} | 2024-01-18T12:28:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liminerity/Blurstral-7b-slerp
Dataset automatically created during the evaluation run of model liminerity/Blurstral-7b-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
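A minimal sketch (the `harness_winogrande_5` configuration is listed in this card's configuration list; the repository id below is an assumption, following the `details_<org>__<model>` naming pattern used by Open LLM Leaderboard detail datasets):

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blurstral-7b-slerp",
    "harness_winogrande_5",
    split="train",
)
print(data)
```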
## Latest results
These are the latest results from run 2024-01-18T12:26:10.788349 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liminerity/Blurstral-7b-slerp\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blurstral-7b-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:26:10.788349(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liminerity/Blurstral-7b-slerp\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blurstral-7b-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:26:10.788349(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f1ea12f9ce8dd42e805a4dde171c87e0fba21fa1 | # 1. Introduction
(1) As far as we know, this is the largest QA dataset for Chinese Construction Laws and Regulations (CCLR). For comparison, well-known datasets like c-eval typically contain only about 500 questions in a single domain, whereas our dataset focuses specifically on the CCLR domain and includes 6,339 questions.
(2) The dataset is developed and maintained by Southeast University, University of Cambridge, and City University of Hong Kong.
(3) Make sure to read the specification and follow the rules.
# 2. Submission of your LLM’s answers
Answers can be submitted through https://forms.gle/bKLj6GgyxSnGenXS8. Please use “Template of answer submission.xls” in this repository to submit your LLM's answers.
# 3. Citation requirement
The reuse of this repository requires citation. Any individual or entity that uses this repository without appropriate acknowledgment and citation has no right to use our data. We will take measures to protect our copyright, including, but not limited to, seeking the retraction of their papers and initiating legal action.
# 4. LLM Leaderboard for CCLR QA
| Large Language Model | Contributors | Overall Scoring Rate | D1 | D2 | D3 | D4 | D5 | D6 | D7 | D8 | Ranking |
|-----|-----|-----|-----|-----|-----|-----|-----|-----|------|------|------|
| ERNIE-Bot 4.0 with knowledge graph | Baidu & The authors | 0.822 | 0.842 | 0.826 | 0.830 | 0.801 | 0.853 | 0.842 | 0.800 | 0.862 | 1 |
| ERNIE-Bot 4.0 | Baidu | 0.757 | 0.783 | 0.718 | 0.762 | 0.768 | 0.724 | 0.724 | 0.731 | 0.788 | 2 |
| GPT-4 with knowledge graph | OpenAI & The authors | 0.666 | 0.719 | 0.734 | 0.661 | 0.660 | 0.757 | 0.681 | 0.664 | 0.689 | 3 |
| GPT-4 | OpenAI | 0.532 | 0.602 | 0.490 | 0.556 | 0.536 | 0.570 | 0.519 | 0.514 | 0.566 | 4 |
| GPT-3.5-turbo with knowledge graph | OpenAI & The authors | 0.504 | 0.532 | 0.503 | 0.527 | 0.472 | 0.626 | 0.522 | 0.540 | 0.467 | 5 |
| ChatGLM3-6B with knowledge graph | Tsinghua, Zhipu.AI & The authors | 0.483 | 0.497 | 0.444 | 0.510 | 0.421 | 0.540 | 0.596 | 0.543 | 0.444 | 6 |
| Text-davinci-003 with knowledge graph | OpenAI & The authors | 0.482 | 0.507 | 0.521 | 0.470 | 0.478 | 0.582 | 0.516 | 0.510 | 0.516 | 7 |
| Qianfan-Chinese-Llama-2-7B with knowledge graph| Baidu & The authors | 0.474 | 0.474 | 0.486 | 0.494 | 0.469 | 0.570 | 0.529 | 0.514 | 0.470 | 8 |
| ChatGLM2-6B with knowledge graph | Tsinghua, Zhipu.AI & The authors | 0.472 | 0.471 | 0.469 | 0.488 | 0.464 | 0.517 | 0.507 | 0.528 | 0.462 | 9 |
| ChatGLM2-6B | Tsinghua & Zhipu.AI | 0.430 | 0.454 | 0.412 | 0.477 | 0.409 | 0.469 | 0.444 | 0.494 | 0.420 | 10 |
| ChatGLM3-6B | Tsinghua & Zhipu.AI | 0.399 | 0.452 | 0.389 | 0.415 | 0.356 | 0.412 | 0.389 | 0.416 | 0.399 | 11 |
| Qianfan-Chinese-Llama-2-7B | Baidu | 0.373 | 0.421 | 0.377 | 0.364 | 0.359 | 0.422 | 0.374 | 0.411 | 0.358 | 12 |
| GPT-3.5-turbo | OpenAI | 0.348 | 0.422 | 0.317 | 0.368 | 0.322 | 0.438 | 0.332 | 0.405 | 0.333 | 13 |
| Llama-2-70b with knowledge graph | MetaAI & The authors | 0.377 | 0.335 | 0.369 | 0.323 | 0.328 | 0.414 | 0.354 | 0.335 | 0.332 | 14 |
| Text-davinci-003 | OpenAI | 0.328 | 0.351 | 0.318 | 0.343 | 0.334 | 0.382 | 0.343 | 0.361 | 0.341 | 15 |
| Llama-2-70b | MetaAI | 0.284 | 0.284 | 0.338 | 0.255 | 0.316 | 0.313 | 0.291 | 0.299 | 0.293 | 16 | | AnonymousSite/QA_dataset_for_CCLR | [
"license:mit",
"region:us"
] | 2024-01-18T12:33:01+00:00 | {"license": "mit"} | 2024-02-07T13:18:52+00:00 | [] | [] | TAGS
#license-mit #region-us
| 1. Introduction
===============
(1) As far as we know, this is the largest QA dataset for Chinese Construction Laws and Regulations (CCLR). For example, well-known datasets like c-eval typically contain only about 500 questions in a single domain, whereas our dataset specifically focuses on the CCLR domain and includes 6,339 questions..
(2) The dataset is developed and maintained by Southeast University, University of Cambridge, and City University of Hong Kong.
(3) Make sure to read the specification and follow the rules.
2. Submission of your LLM’s answers
===================================
The answers could be submitted through URL Please use “Template of answer URL” in this repository to submit your LLM's answers
3. Citation requirement
=======================
The reuse of this repository requires citation. Should any individual or entity utilize this repository without appropriate acknowledgment and citation, they do not have the right to use our data. We will take measures to protect our copyright, including, but not limited to, retracting their papers and initiating legal action.
4.LLM Leaderboard for CCLR QA
=============================
| [] | [
"TAGS\n#license-mit #region-us \n"
] |
50d746d284e3fc8dace81f5336439fe78f4cbe60 |
# Dataset Card for Evaluation run of liminerity/Blured-Ties-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blured-Ties-7B](https://huggingface.co/liminerity/Blured-Ties-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blured-Ties-7B",
"harness_winogrande_5",
split="train")
```
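Once loaded, a quick way to inspect what the split contains (a sketch, assuming pandas is available, as it is when `datasets` is installed):

```python
# Convert the loaded split to a pandas DataFrame for inspection.
df = data.to_pandas()
print(len(df), "examples")
print(df.columns.tolist())  # per-example fields recorded by the evaluation harness
```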
## Latest results
These are the [latest results from run 2024-01-18T12:35:33.368960](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blured-Ties-7B/blob/main/results_2024-01-18T12-35-33.368960.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6318817875919636,
"acc_stderr": 0.03270403460584993,
"acc_norm": 0.6355003158859899,
"acc_norm_stderr": 0.03336663859733476,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5812148371245991,
"mc2_stderr": 0.015197448749237714
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.6399317406143344,
"acc_norm_stderr": 0.01402751681458519
},
"harness|hellaswag|10": {
"acc": 0.6432981477793268,
"acc_stderr": 0.0047804672709117705,
"acc_norm": 0.8355905198167696,
"acc_norm_stderr": 0.0036988923883801024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899136,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573695,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573695
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5812148371245991,
"mc2_stderr": 0.015197448749237714
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936636
},
"harness|gsm8k|5": {
"acc": 0.46929492039423804,
"acc_stderr": 0.013746490739560037
}
}
```
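The same aggregated numbers can also be pulled programmatically from the "results" configuration (a sketch, assuming the "results" configuration and its "latest" split are present, as with the other configurations of this dataset):

```python
from datasets import load_dataset

# Aggregated results for the latest run (sketch; assumes the "results"
# configuration exposes a "latest" split like the other configurations).
results = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blured-Ties-7B",
    "results",
    split="latest",
)
print(results[0])
```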
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liminerity__Blured-Ties-7B | [
"region:us"
] | 2024-01-18T12:37:52+00:00 | {"pretty_name": "Evaluation run of liminerity/Blured-Ties-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/Blured-Ties-7B](https://huggingface.co/liminerity/Blured-Ties-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blured-Ties-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T12:35:33.368960](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blured-Ties-7B/blob/main/results_2024-01-18T12-35-33.368960.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6318817875919636,\n \"acc_stderr\": 0.03270403460584993,\n \"acc_norm\": 0.6355003158859899,\n \"acc_norm_stderr\": 0.03336663859733476,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5812148371245991,\n \"mc2_stderr\": 0.015197448749237714\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.01402751681458519\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6432981477793268,\n \"acc_stderr\": 0.0047804672709117705,\n \"acc_norm\": 0.8355905198167696,\n \"acc_norm_stderr\": 0.0036988923883801024\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.03761070869867479,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.03761070869867479\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 
0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8122605363984674,\n \"acc_stderr\": 0.013964393769899136,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573695,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573695\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5812148371245991,\n \"mc2_stderr\": 0.015197448749237714\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936636\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.46929492039423804,\n \"acc_stderr\": 
0.013746490739560037\n }\n}\n```", "repo_url": "https://huggingface.co/liminerity/Blured-Ties-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-35-33.368960.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-35-33.368960.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-35-33.368960.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-35-33.368960.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-35-33.368960.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T12_35_33.368960", "path": ["**/details_harness|winogrande|5_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T12-35-33.368960.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T12_35_33.368960", "path": ["results_2024-01-18T12-35-33.368960.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T12-35-33.368960.parquet"]}]}]} | 2024-01-18T12:38:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liminerity/Blured-Ties-7B
Dataset automatically created during the evaluation run of model liminerity/Blured-Ties-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
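For example (a minimal sketch: the repository id below is assumed from the standard `details_<org>__<model>` naming used for these evaluation datasets, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of one task configuration.
# The repository id is assumed from the usual naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blured-Ties-7B",
    "harness_winogrande_5",   # one of the 63 task configurations
    split="train",            # always points to the latest run
)
```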
## Latest results
These are the latest results from run 2024-01-18T12:35:33.368960 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liminerity/Blured-Ties-7B\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blured-Ties-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:35:33.368960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liminerity/Blured-Ties-7B\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blured-Ties-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:35:33.368960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3b44ec6da806af2faa541f4161c6fac576260bf7 | # Dataset Card for "dataset-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mehdidc/dataset-test | [
"region:us"
] | 2024-01-18T12:43:35+00:00 | {"dataset_info": {"features": [{"name": "caption", "dtype": "string"}, {"name": "caption_source", "dtype": "string"}, {"name": "image_0_url", "dtype": "string"}, {"name": "image_1_url", "dtype": "string"}, {"name": "label_0", "dtype": "float64"}, {"name": "label_1", "dtype": "float64"}, {"name": "num_example_per_prompt", "dtype": "int64"}, {"name": "model_0", "dtype": "string"}, {"name": "model_1", "dtype": "string"}, {"name": "jpg_0", "dtype": "binary"}, {"name": "jpg_1", "dtype": "binary"}, {"name": "are_different", "dtype": "bool"}, {"name": "has_label", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 292907, "num_examples": 1}], "download_size": 300728, "dataset_size": 292907}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T13:19:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dataset-test"
More Information needed | [
"# Dataset Card for \"dataset-test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dataset-test\"\n\nMore Information needed"
] |
16869c704301d1f37cc778f876f0440f68185a83 |
# Dataset Card for Evaluation run of leveldevai/TurdusDareBeagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leveldevai/TurdusDareBeagle-7B](https://huggingface.co/leveldevai/TurdusDareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B",
"harness_winogrande_5",
split="train")
```
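The aggregated metrics described above can be read through the "results" configuration in the same way (a minimal sketch; the configuration and split names are taken from the file listing in this card's metadata):

```python
from datasets import load_dataset

# Aggregated metrics of the run; the "latest" split points to the most recent eval.
results = load_dataset(
    "open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B",
    "results",
    split="latest",
)
```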
## Latest results
These are the [latest results from run 2024-01-18T12:52:49.102510](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B/blob/main/results_2024-01-18T12-52-49.102510.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6547607632913887,
"acc_stderr": 0.03205617544070551,
"acc_norm": 0.6539975555906383,
"acc_norm_stderr": 0.0327278172321473,
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743183,
"mc2": 0.6889794032014356,
"mc2_stderr": 0.015072581970460247
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635753
},
"harness|hellaswag|10": {
"acc": 0.7157936666002789,
"acc_stderr": 0.004501137895230723,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.0031898897894046723
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5556915544675642,
"mc1_stderr": 0.017394586250743183,
"mc2": 0.6889794032014356,
"mc2_stderr": 0.015072581970460247
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.012532334368242888
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B | [
"region:us"
] | 2024-01-18T12:55:07+00:00 | {"pretty_name": "Evaluation run of leveldevai/TurdusDareBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [leveldevai/TurdusDareBeagle-7B](https://huggingface.co/leveldevai/TurdusDareBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T12:52:49.102510](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__TurdusDareBeagle-7B/blob/main/results_2024-01-18T12-52-49.102510.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6547607632913887,\n \"acc_stderr\": 0.03205617544070551,\n \"acc_norm\": 0.6539975555906383,\n \"acc_norm_stderr\": 0.0327278172321473,\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743183,\n \"mc2\": 0.6889794032014356,\n \"mc2_stderr\": 0.015072581970460247\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635753\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7157936666002789,\n \"acc_stderr\": 0.004501137895230723,\n \"acc_norm\": 0.8844851623182632,\n \"acc_norm_stderr\": 0.0031898897894046723\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n 
\"acc_norm_stderr\": 0.02098685459328974\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5556915544675642,\n \"mc1_stderr\": 0.017394586250743183,\n \"mc2\": 0.6889794032014356,\n \"mc2_stderr\": 0.015072581970460247\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7073540561031084,\n \"acc_stderr\": 0.012532334368242888\n }\n}\n```", "repo_url": "https://huggingface.co/leveldevai/TurdusDareBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-52-49.102510.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["**/details_harness|winogrande|5_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T12-52-49.102510.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T12_52_49.102510", "path": ["results_2024-01-18T12-52-49.102510.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T12-52-49.102510.parquet"]}]}]} | 2024-01-18T12:55:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of leveldevai/TurdusDareBeagle-7B
Dataset automatically created during the evaluation run of model leveldevai/TurdusDareBeagle-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-18T12:52:49.102510 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of leveldevai/TurdusDareBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/TurdusDareBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:52:49.102510(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of leveldevai/TurdusDareBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/TurdusDareBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:52:49.102510(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ffd83acdc4b83e687a4b97b81e1a923199e716fb |
# Dataset Card for Evaluation run of Cartinoe5930/iDUS-8layers
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cartinoe5930/iDUS-8layers](https://huggingface.co/Cartinoe5930/iDUS-8layers) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers",
"harness_winogrande_5",
split="train")
```
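
For the aggregated numbers rather than per-example details, the "results" configuration mentioned above can be loaded in the same way. The snippet below is a minimal sketch that assumes the standard `datasets` API and the "latest" split alias used by this card's configurations:

```python
from datasets import load_dataset

# Aggregated metrics of the run; "latest" always resolves to the most recent
# evaluation (the other splits are named after the run timestamp).
results = load_dataset(
    "open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers",
    "results",
    split="latest",
)

# One row per run, with the aggregated metrics stored as nested columns.
print(results[0])
```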
## Latest results
These are the [latest results from run 2024-01-18T12:59:32.021751](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers/blob/main/results_2024-01-18T12-59-32.021751.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6274188914519915,
"acc_stderr": 0.03268111082607906,
"acc_norm": 0.6345805741448237,
"acc_norm_stderr": 0.033354460317565934,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.40619217776266314,
"mc2_stderr": 0.014256400476902515
},
"harness|arc:challenge|25": {
"acc": 0.560580204778157,
"acc_stderr": 0.014503747823580123,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009123
},
"harness|hellaswag|10": {
"acc": 0.6199960167297351,
"acc_stderr": 0.004843954338451447,
"acc_norm": 0.8133837880900219,
"acc_norm_stderr": 0.0038880689432920744
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399327,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421125,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426125,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426125
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.01271540484127774,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.01271540484127774
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417482,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417482
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.40619217776266314,
"mc2_stderr": 0.014256400476902515
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
},
"harness|gsm8k|5": {
"acc": 0.29567854435178165,
"acc_stderr": 0.01257006894789878
}
}
```
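
To dig into the per-example predictions behind any of the scores above (for instance the GSM8K accuracy), the corresponding per-task configuration can be loaded directly. This is a sketch that assumes the card follows the usual `harness_<task>_<n_fewshot>` configuration naming of these evaluation datasets (e.g. `harness_gsm8k_5`):

```python
from datasets import load_dataset

# Per-example details for a single task of the latest run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers",
    "harness_gsm8k_5",
    split="latest",
)

print(gsm8k_details)            # one row per evaluated example
print(gsm8k_details[0].keys())  # per-example fields (prompt, predictions, metrics, ...)
```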
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers | [
"region:us"
] | 2024-01-18T13:01:49+00:00 | {"pretty_name": "Evaluation run of Cartinoe5930/iDUS-8layers", "dataset_summary": "Dataset automatically created during the evaluation run of model [Cartinoe5930/iDUS-8layers](https://huggingface.co/Cartinoe5930/iDUS-8layers) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T12:59:32.021751](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers/blob/main/results_2024-01-18T12-59-32.021751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274188914519915,\n \"acc_stderr\": 0.03268111082607906,\n \"acc_norm\": 0.6345805741448237,\n \"acc_norm_stderr\": 0.033354460317565934,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.40619217776266314,\n \"mc2_stderr\": 0.014256400476902515\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.560580204778157,\n \"acc_stderr\": 0.014503747823580123,\n \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009123\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6199960167297351,\n \"acc_stderr\": 0.004843954338451447,\n \"acc_norm\": 0.8133837880900219,\n \"acc_norm_stderr\": 0.0038880689432920744\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399327,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399327\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n 
\"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n \"acc_stderr\": 0.015652542496421125,\n \"acc_norm\": 0.3240223463687151,\n \"acc_norm_stderr\": 0.015652542496421125\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426125,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426125\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.01271540484127774,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.01271540484127774\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417482,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417482\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.40619217776266314,\n \"mc2_stderr\": 0.014256400476902515\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29567854435178165,\n \"acc_stderr\": 0.01257006894789878\n }\n}\n```", "repo_url": 
"https://huggingface.co/Cartinoe5930/iDUS-8layers", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-59-32.021751.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-59-32.021751.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-59-32.021751.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T12-59-32.021751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-59-32.021751.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T12_59_32.021751", "path": ["**/details_harness|winogrande|5_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T12-59-32.021751.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T12_59_32.021751", "path": ["results_2024-01-18T12-59-32.021751.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T12-59-32.021751.parquet"]}]}]} | 2024-01-18T13:02:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Cartinoe5930/iDUS-8layers
Dataset automatically created during the evaluation run of model Cartinoe5930/iDUS-8layers on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
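For example, a minimal sketch with the Hugging Face `datasets` library (the repository name below assumes the usual open-llm-leaderboard "details_<org>__<model>" naming convention):

```python
from datasets import load_dataset

# Assumed repository name, following the open-llm-leaderboard
# "details_<org>__<model>" convention; adjust if it differs.
data = load_dataset(
    "open-llm-leaderboard/details_Cartinoe5930__iDUS-8layers",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",
)
print(data)
```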
## Latest results
These are the latest results from run 2024-01-18T12:59:32.021751 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Cartinoe5930/iDUS-8layers\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/iDUS-8layers on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:59:32.021751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Cartinoe5930/iDUS-8layers\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/iDUS-8layers on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T12:59:32.021751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4abf0a815098269ab209171e0015a7201cbde3b0 |
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.2](https://huggingface.co/liminerity/Blur-7b-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.2",
"harness_winogrande_5",
split="train")
```
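If you are unsure which of the 63 configurations to request, the `datasets` library can enumerate them. The snippet below is a small sketch (it assumes network access to the Hugging Face Hub and the split names declared in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_liminerity__Blur-7b-v1.2"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "results" configuration stores the aggregated metrics shown below;
# the "latest" split always points at the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```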
## Latest results
These are the [latest results from run 2024-01-18T13:00:27.961191](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.2/blob/main/results_2024-01-18T13-00-27.961191.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6357129950975389,
"acc_stderr": 0.03262066192131251,
"acc_norm": 0.6382762311799055,
"acc_norm_stderr": 0.03328259277014658,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6030326315591199,
"mc2_stderr": 0.015260409379504259
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.01414419347189345,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063223
},
"harness|hellaswag|10": {
"acc": 0.6528579964150567,
"acc_stderr": 0.00475088440109516,
"acc_norm": 0.8387771360286795,
"acc_norm_stderr": 0.0036698484004877773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091095,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464073,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464073
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176136,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176136
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032207,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6030326315591199,
"mc2_stderr": 0.015260409379504259
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.01111698339239267
},
"harness|gsm8k|5": {
"acc": 0.5284306292645944,
"acc_stderr": 0.013750202076584422
}
}
```
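The same figures can be pulled programmatically from the results file referenced in this card's configuration. The snippet below is only a sketch assuming the standard leaderboard file layout (the exact nesting of the JSON may differ from run to run):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results JSON for this run (filename as referenced
# in this dataset's configuration) and print the overall averages.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_liminerity__Blur-7b-v1.2",
    repo_type="dataset",
    filename="results_2024-01-18T13-00-27.961191.json",
)
with open(path) as f:
    report = json.load(f)

# The metrics shown above may be nested under a "results" key in the file.
summary = report.get("results", report)
print(summary["all"])
```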
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liminerity__Blur-7b-v1.2 | [
"region:us"
] | 2024-01-18T13:02:47+00:00 | {"pretty_name": "Evaluation run of liminerity/Blur-7b-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.2](https://huggingface.co/liminerity/Blur-7b-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-v1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:00:27.961191](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.2/blob/main/results_2024-01-18T13-00-27.961191.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6357129950975389,\n \"acc_stderr\": 0.03262066192131251,\n \"acc_norm\": 0.6382762311799055,\n \"acc_norm_stderr\": 0.03328259277014658,\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6030326315591199,\n \"mc2_stderr\": 0.015260409379504259\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.01414419347189345,\n \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063223\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n \"acc_stderr\": 0.00475088440109516,\n \"acc_norm\": 0.8387771360286795,\n \"acc_norm_stderr\": 0.0036698484004877773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091095,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 
0.013778693778464073,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464073\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n \"acc_stderr\": 0.016706617522176136,\n \"acc_norm\": 0.4782122905027933,\n \"acc_norm_stderr\": 0.016706617522176136\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032207,\n \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032207\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6030326315591199,\n \"mc2_stderr\": 0.015260409379504259\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.01111698339239267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5284306292645944,\n \"acc_stderr\": 0.013750202076584422\n }\n}\n```", "repo_url": 
"https://huggingface.co/liminerity/Blur-7b-v1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_00_27.961191", "path": ["**/details_harness|winogrande|5_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-00-27.961191.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_00_27.961191", "path": ["results_2024-01-18T13-00-27.961191.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-00-27.961191.parquet"]}]}]} | 2024-01-18T13:03:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.2
Dataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
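A minimal sketch (the repository id and the `harness_winogrande_5` configuration name are taken from this card's metadata; any of the 63 configurations can be substituted):

```python
from datasets import load_dataset

# Per-sample details for one task configuration of this evaluation run.
# The "train" split points at the latest results for that task.
data = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blur-7b-v1.2",
    "harness_winogrande_5",
    split="train",
)
```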
## Latest results
These are the latest results from run 2024-01-18T13:00:27.961191 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.2\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:00:27.961191(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.2\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:00:27.961191(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0acedbbc2ed8c3fbcb5aa5648f9271f32f8404a9 |
# Dataset Card for Evaluation run of Charlie911/MultiLora-sharegpt
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Charlie911/MultiLora-sharegpt](https://huggingface.co/Charlie911/MultiLora-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt",
"harness_winogrande_5",
    split="latest")
```
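
The aggregated metrics mentioned above live in the "results" configuration. As a minimal sketch (assuming the config and split names listed in this repository's configurations, where a "latest" split is defined for every config), they can be loaded the same way:

```python
from datasets import load_dataset

# Aggregated metrics of the run; the "latest" split always points to the
# most recent evaluation (timestamped splits are also available).
results = load_dataset(
    "open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt",
    "results",
    split="latest",
)
print(results[0])  # one row per run, with the aggregated metrics as columns
```

Each per-task configuration (for example `harness_arc_challenge_25` or `harness_gsm8k_5`) can be loaded with the same call by swapping in the configuration name.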
## Latest results
These are the [latest results from run 2024-01-18T13:16:53.063805](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt/blob/main/results_2024-01-18T13-16-53.063805.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38101030685997583,
"acc_stderr": 0.03410831671987077,
"acc_norm": 0.3855147425687915,
"acc_norm_stderr": 0.03494756046521787,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.45851126549802557,
"mc2_stderr": 0.015260986256788305
},
"harness|arc:challenge|25": {
"acc": 0.41467576791808874,
"acc_stderr": 0.014397070564409172,
"acc_norm": 0.4564846416382253,
"acc_norm_stderr": 0.014555949760496435
},
"harness|hellaswag|10": {
"acc": 0.4879506074487154,
"acc_stderr": 0.004988332289642083,
"acc_norm": 0.6554471220872337,
"acc_norm_stderr": 0.004742510354777903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353229,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353229
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4339622641509434,
"acc_stderr": 0.03050329201334259,
"acc_norm": 0.4339622641509434,
"acc_norm_stderr": 0.03050329201334259
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761008,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761008
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03902551007374449,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03902551007374449
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.03559443565563919,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.03559443565563919
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5181347150259067,
"acc_stderr": 0.03606065001832919,
"acc_norm": 0.5181347150259067,
"acc_norm_stderr": 0.03606065001832919
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.024962683564331803,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.024962683564331803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47706422018348627,
"acc_stderr": 0.0214147570581755,
"acc_norm": 0.47706422018348627,
"acc_norm_stderr": 0.0214147570581755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4430379746835443,
"acc_stderr": 0.03233532777533484,
"acc_norm": 0.4430379746835443,
"acc_norm_stderr": 0.03233532777533484
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030049,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030049
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.03731133519673891,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.03731133519673891
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.42718446601941745,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.42718446601941745,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4700854700854701,
"acc_stderr": 0.032697411068124425,
"acc_norm": 0.4700854700854701,
"acc_norm_stderr": 0.032697411068124425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.49808429118773945,
"acc_stderr": 0.017879832259026677,
"acc_norm": 0.49808429118773945,
"acc_norm_stderr": 0.017879832259026677
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.025416003773165566,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.025416003773165566
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249596,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.42443729903536975,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.42443729903536975,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28552803129074317,
"acc_stderr": 0.011535751586665664,
"acc_norm": 0.28552803129074317,
"acc_norm_stderr": 0.011535751586665664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.019412539242032168,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.019412539242032168
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.047093069786618966,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.047093069786618966
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4163265306122449,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.4163265306122449,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4925373134328358,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.4925373134328358,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.035294868015111155,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.035294868015111155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.038268824176603676,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.038268824176603676
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.45851126549802557,
"mc2_stderr": 0.015260986256788305
},
"harness|winogrande|5": {
"acc": 0.6661404893449092,
"acc_stderr": 0.013254029695143348
},
"harness|gsm8k|5": {
"acc": 0.039423805913570885,
"acc_stderr": 0.005360280030342458
}
}
```
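
The same numbers can also be read straight from the per-run results file linked above. The sketch below uses `huggingface_hub` and hedges on the exact JSON layout: the block shown above corresponds either to the whole file or to its `"results"` field, so the code handles both cases.

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt",
    repo_type="dataset",
    filename="results_2024-01-18T13-16-53.063805.json",
)
with open(path) as f:
    raw = json.load(f)

# The metrics block shown above may be the whole file or nested under "results".
metrics = raw.get("results", raw)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
print(metrics["harness|gsm8k|5"]["acc"])
```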
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt | [
"region:us"
] | 2024-01-18T13:19:18+00:00 | {"pretty_name": "Evaluation run of Charlie911/MultiLora-sharegpt", "dataset_summary": "Dataset automatically created during the evaluation run of model [Charlie911/MultiLora-sharegpt](https://huggingface.co/Charlie911/MultiLora-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:16:53.063805](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt/blob/main/results_2024-01-18T13-16-53.063805.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38101030685997583,\n \"acc_stderr\": 0.03410831671987077,\n \"acc_norm\": 0.3855147425687915,\n \"acc_norm_stderr\": 0.03494756046521787,\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.45851126549802557,\n \"mc2_stderr\": 0.015260986256788305\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.014397070564409172,\n \"acc_norm\": 0.4564846416382253,\n \"acc_norm_stderr\": 0.014555949760496435\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4879506074487154,\n \"acc_stderr\": 0.004988332289642083,\n \"acc_norm\": 0.6554471220872337,\n \"acc_norm_stderr\": 0.004742510354777903\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353229,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353229\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4339622641509434,\n \"acc_stderr\": 0.03050329201334259,\n \"acc_norm\": 0.4339622641509434,\n \"acc_norm_stderr\": 0.03050329201334259\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n 
\"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47096774193548385,\n \"acc_stderr\": 0.028396016402761008,\n \"acc_norm\": 0.47096774193548385,\n \"acc_norm_stderr\": 0.028396016402761008\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374449,\n \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374449\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4797979797979798,\n \"acc_stderr\": 0.03559443565563919,\n \"acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.03559443565563919\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5181347150259067,\n \"acc_stderr\": 0.03606065001832919,\n \"acc_norm\": 0.5181347150259067,\n \"acc_norm_stderr\": 0.03606065001832919\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.024962683564331803,\n \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.024962683564331803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.031918633744784645,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.031918633744784645\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.47706422018348627,\n \"acc_stderr\": 0.0214147570581755,\n \"acc_norm\": 0.47706422018348627,\n \"acc_norm_stderr\": 0.0214147570581755\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4430379746835443,\n \"acc_stderr\": 0.03233532777533484,\n \"acc_norm\": 0.4430379746835443,\n \"acc_norm_stderr\": 0.03233532777533484\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.39669421487603307,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.04524596007030049,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.04524596007030049\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.03731133519673891,\n \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.03731133519673891\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4700854700854701,\n \"acc_stderr\": 0.032697411068124425,\n \"acc_norm\": 0.4700854700854701,\n \"acc_norm_stderr\": 0.032697411068124425\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.49808429118773945,\n \"acc_stderr\": 0.017879832259026677,\n \"acc_norm\": 0.49808429118773945,\n \"acc_norm_stderr\": 0.017879832259026677\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.025416003773165566,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.025416003773165566\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249596,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249596\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.02824513402438729,\n \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.02824513402438729\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.42443729903536975,\n \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.42443729903536975,\n \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.027648477877413327,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.027648477877413327\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28552803129074317,\n \"acc_stderr\": 0.011535751586665664,\n \"acc_norm\": 0.28552803129074317,\n \"acc_norm_stderr\": 0.011535751586665664\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.019412539242032168,\n \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.019412539242032168\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n \"acc_stderr\": 0.047093069786618966,\n \"acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.047093069786618966\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4163265306122449,\n \"acc_stderr\": 0.03155782816556165,\n \"acc_norm\": 0.4163265306122449,\n \"acc_norm_stderr\": 0.03155782816556165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4925373134328358,\n \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.4925373134328358,\n \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.035294868015111155,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.035294868015111155\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.038268824176603676,\n \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.038268824176603676\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.45851126549802557,\n \"mc2_stderr\": 0.015260986256788305\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6661404893449092,\n \"acc_stderr\": 0.013254029695143348\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.039423805913570885,\n \"acc_stderr\": 
0.005360280030342458\n }\n}\n```", "repo_url": "https://huggingface.co/Charlie911/MultiLora-sharegpt", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-16-53.063805.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-16-53.063805.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-16-53.063805.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-16-53.063805.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-16-53.063805.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_16_53.063805", "path": ["**/details_harness|winogrande|5_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-16-53.063805.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_16_53.063805", "path": ["results_2024-01-18T13-16-53.063805.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-16-53.063805.parquet"]}]}]} | 2024-01-18T13:19:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Charlie911/MultiLora-sharegpt
Dataset automatically created during the evaluation run of model Charlie911/MultiLora-sharegpt on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
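A minimal loading sketch, assuming the details repo follows the usual Open LLM Leaderboard naming scheme (open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt) and that a harness_winogrande_5 configuration exists as in comparable evaluation runs:
```python
from datasets import load_dataset

# Repo id and config name are assumed from the usual leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_Charlie911__MultiLora-sharegpt",
    "harness_winogrande_5",
    split="train",
)
print(data)
```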
## Latest results
These are the latest results from run 2024-01-18T13:16:53.063805 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Charlie911/MultiLora-sharegpt\n\n\n\nDataset automatically created during the evaluation run of model Charlie911/MultiLora-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:16:53.063805(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Charlie911/MultiLora-sharegpt\n\n\n\nDataset automatically created during the evaluation run of model Charlie911/MultiLora-sharegpt on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:16:53.063805(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7f1dff5b9d5a08a7bc40458f78a3c203e596cc4d |
This dataset is based on [neovalle/H4rmony](https://huggingface.co/datasets/neovalle/H4rmony), and optimised to the format required by DPOTrainer from the trl library. | neovalle/H4rmony_dpo | [
"task_categories:question-answering",
"task_categories:text-classification",
"task_categories:reinforcement-learning",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"license:mit",
"ecolinguistics",
"ecology",
"sustainability",
"environment",
"synthetic",
"region:us"
] | 2024-01-18T13:20:21+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-classification", "reinforcement-learning", "text-generation"], "tags": ["ecolinguistics", "ecology", "sustainability", "environment", "synthetic"]} | 2024-02-05T15:03:58+00:00 | [] | [] | TAGS
#task_categories-question-answering #task_categories-text-classification #task_categories-reinforcement-learning #task_categories-text-generation #size_categories-1K<n<10K #license-mit #ecolinguistics #ecology #sustainability #environment #synthetic #region-us
|
This dataset is based on neovalle/H4rmony, and optimised to the format required by DPOTrainer from the trl library. | [] | [
"TAGS\n#task_categories-question-answering #task_categories-text-classification #task_categories-reinforcement-learning #task_categories-text-generation #size_categories-1K<n<10K #license-mit #ecolinguistics #ecology #sustainability #environment #synthetic #region-us \n"
] |
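As a hedged illustration of the DPO format mentioned above: a minimal sketch of loading neovalle/H4rmony_dpo and inspecting its columns, assuming a "train" split and the prompt/chosen/rejected schema that trl's DPOTrainer typically expects.
```python
from datasets import load_dataset

# Sketch: the split name and the prompt/chosen/rejected column names are assumptions
# based on the DPO-style format that trl's DPOTrainer typically expects.
ds = load_dataset("neovalle/H4rmony_dpo", split="train")
print(ds.column_names)
print(ds[0])
```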
da840bf8288e0eb8945417b76dfee4df29a295d9 |
# Dataset Card for Evaluation run of chargoddard/internlm2-20b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/internlm2-20b-llama](https://huggingface.co/chargoddard/internlm2-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-20b-llama",
"harness_winogrande_5",
split="train")
```
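The aggregated metrics mentioned above are stored in the "results" configuration; a small sketch for reading them, assuming the split naming used elsewhere in this repo, where split names mirror run timestamps and "latest" points at the newest run:
```python
from datasets import load_dataset

# "results" holds the aggregated run-level metrics; "latest" is assumed to point
# at the most recent evaluation run, as in the other configurations of this repo.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__internlm2-20b-llama",
    "results",
)
print(results)             # available splits (one per run timestamp, plus "latest")
print(results["latest"][0])
```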
## Latest results
These are the [latest results from run 2024-01-18T17:52:12.379059](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-20b-llama/blob/main/results_2024-01-18T17-52-12.379059.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6745878141840577,
"acc_stderr": 0.03167073179485568,
"acc_norm": 0.6749429408040124,
"acc_norm_stderr": 0.032336180832304856,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5413315935552325,
"mc2_stderr": 0.015717993914435745
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251098,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756558
},
"harness|hellaswag|10": {
"acc": 0.6414060944035053,
"acc_stderr": 0.004786075107572185,
"acc_norm": 0.8312089225253934,
"acc_norm_stderr": 0.00373801773403787
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7396226415094339,
"acc_stderr": 0.027008766090708045,
"acc_norm": 0.7396226415094339,
"acc_norm_stderr": 0.027008766090708045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059004,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059004
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.0402873153294756,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.0402873153294756
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604549,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604549
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562066,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562066
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645354,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646508,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646508
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7436974789915967,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.7436974789915967,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649412,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649412
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595694,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508783,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508783
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194164,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194164
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899115,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899115
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.0247238615047717,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.0247238615047717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.02346842983245114,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.02346842983245114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5058670143415906,
"acc_stderr": 0.012769356925216526,
"acc_norm": 0.5058670143415906,
"acc_norm_stderr": 0.012769356925216526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5413315935552325,
"mc2_stderr": 0.015717993914435745
},
"harness|winogrande|5": {
"acc": 0.8421468034727704,
"acc_stderr": 0.010247165248719763
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.01254183081546149
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chargoddard__internlm2-20b-llama | [
"region:us"
] | 2024-01-18T13:20:48+00:00 | {"pretty_name": "Evaluation run of chargoddard/internlm2-20b-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/internlm2-20b-llama](https://huggingface.co/chargoddard/internlm2-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-20b-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T17:52:12.379059](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-20b-llama/blob/main/results_2024-01-18T17-52-12.379059.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6745878141840577,\n \"acc_stderr\": 0.03167073179485568,\n \"acc_norm\": 0.6749429408040124,\n \"acc_norm_stderr\": 0.032336180832304856,\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5413315935552325,\n \"mc2_stderr\": 0.015717993914435745\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251098,\n \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756558\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n \"acc_stderr\": 0.004786075107572185,\n \"acc_norm\": 0.8312089225253934,\n \"acc_norm_stderr\": 0.00373801773403787\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7396226415094339,\n \"acc_stderr\": 0.027008766090708045,\n \"acc_norm\": 0.7396226415094339,\n \"acc_norm_stderr\": 0.027008766090708045\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059004,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059004\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745647,\n \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745647\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.0402873153294756,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.0402873153294756\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604549,\n \"acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604549\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n \"acc_stderr\": 0.02141724293632158,\n \"acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.02141724293632158\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562066,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562066\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n \"acc_norm\": 0.8756476683937824,\n 
\"acc_norm_stderr\": 0.023814477086593556\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645354,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646508,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646508\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7436974789915967,\n \"acc_stderr\": 0.02835962087053395,\n \"acc_norm\": 0.7436974789915967,\n \"acc_norm_stderr\": 0.02835962087053395\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649412,\n \"acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649412\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595694,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508783,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508783\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.01872430174194164,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.01872430174194164\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899115,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899115\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958157,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958157\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.02346842983245114,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.02346842983245114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5058670143415906,\n \"acc_stderr\": 0.012769356925216526,\n \"acc_norm\": 0.5058670143415906,\n \"acc_norm_stderr\": 0.012769356925216526\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225395,\n \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225395\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5413315935552325,\n \"mc2_stderr\": 0.015717993914435745\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n 
\"acc_stderr\": 0.01254183081546149\n }\n}\n```", "repo_url": "https://huggingface.co/chargoddard/internlm2-20b-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|arc:challenge|25_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|gsm8k|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hellaswag|10_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-18-39.754211.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-18-39.754211.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": 
"2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-18-39.754211.parquet"]}, 
{"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["**/details_harness|winogrande|5_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": ["**/details_harness|winogrande|5_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T17-52-12.379059.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T13_18_39.754211", "path": ["results_2024-01-18T13-18-39.754211.parquet"]}, {"split": "2024_01_18T17_52_12.379059", "path": 
["results_2024-01-18T17-52-12.379059.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T17-52-12.379059.parquet"]}]}]} | 2024-01-18T17:54:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/internlm2-20b-llama
Dataset automatically created during the evaluation run of model chargoddard/internlm2-20b-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-18T17:52:12.379059 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chargoddard/internlm2-20b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-20b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T17:52:12.379059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/internlm2-20b-llama\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/internlm2-20b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T17:52:12.379059(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
36c08001921268a86a764260e1dba2d2ec137ac9 |
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.21
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.21](https://huggingface.co/liminerity/Blur-7b-v1.21) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.21",
"harness_winogrande_5",
split="train")
```
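The same pattern works for any of the per-task configurations listed in this repository's metadata, or for the aggregated "results" configuration mentioned above. A minimal sketch (the `"results"` config and `"latest"` split names are taken from this card's metadata; treat them as assumptions if the repo layout changes):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
# "results" and "latest" follow the naming used elsewhere in this card.
results = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blur-7b-v1.21",
    "results",
    split="latest",
)
print(results[0])
```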
## Latest results
These are the [latest results from run 2024-01-18T13:28:00.366540](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.21/blob/main/results_2024-01-18T13-28-00.366540.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6540458763545218,
"acc_stderr": 0.032093019516955965,
"acc_norm": 0.6534601787133112,
"acc_norm_stderr": 0.032764115724543935,
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960867,
"mc2": 0.6799010994882542,
"mc2_stderr": 0.01527627642493985
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726291,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.01328452529240352
},
"harness|hellaswag|10": {
"acc": 0.712109141605258,
"acc_stderr": 0.004518546274738885,
"acc_norm": 0.8807010555666202,
"acc_norm_stderr": 0.003234774980647951
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887027,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887027
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846178,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846178
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608313,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4692737430167598,
"acc_stderr": 0.016690896161944385,
"acc_norm": 0.4692737430167598,
"acc_norm_stderr": 0.016690896161944385
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960867,
"mc2": 0.6799010994882542,
"mc2_stderr": 0.01527627642493985
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292406
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.01267929754951543
}
}
```
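One simple aggregate that can be derived from the per-subject entries above is the mean accuracy over the `hendrycksTest-*` (MMLU) tasks. A minimal sketch, assuming the JSON block above has already been parsed into a Python dict named `scores` (the variable name is hypothetical):

```python
# `scores` is assumed to hold the dict shown above (e.g. parsed with json.loads).
mmlu = {k: v for k, v in scores.items() if k.startswith("harness|hendrycksTest-")}
mmlu_mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subjects, mean acc = {mmlu_mean_acc:.4f}")
```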
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liminerity__Blur-7b-v1.21 | [
"region:us"
] | 2024-01-18T13:30:37+00:00 | {"pretty_name": "Evaluation run of liminerity/Blur-7b-v1.21", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.21](https://huggingface.co/liminerity/Blur-7b-v1.21) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-v1.21\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:28:00.366540](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.21/blob/main/results_2024-01-18T13-28-00.366540.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6540458763545218,\n \"acc_stderr\": 0.032093019516955965,\n \"acc_norm\": 0.6534601787133112,\n \"acc_norm_stderr\": 0.032764115724543935,\n \"mc1\": 0.5397796817625459,\n \"mc1_stderr\": 0.017448017223960867,\n \"mc2\": 0.6799010994882542,\n \"mc2_stderr\": 0.01527627642493985\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726291,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.01328452529240352\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.712109141605258,\n \"acc_stderr\": 0.004518546274738885,\n \"acc_norm\": 0.8807010555666202,\n \"acc_norm_stderr\": 0.003234774980647951\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887027,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887027\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846178,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846178\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4692737430167598,\n \"acc_stderr\": 0.016690896161944385,\n \"acc_norm\": 0.4692737430167598,\n \"acc_norm_stderr\": 0.016690896161944385\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5397796817625459,\n \"mc1_stderr\": 0.017448017223960867,\n \"mc2\": 0.6799010994882542,\n \"mc2_stderr\": 0.01527627642493985\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \"acc_stderr\": 0.01267929754951543\n 
}\n}\n```", "repo_url": "https://huggingface.co/liminerity/Blur-7b-v1.21", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_28_00.366540", "path": ["**/details_harness|winogrande|5_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-28-00.366540.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_28_00.366540", "path": ["results_2024-01-18T13-28-00.366540.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-28-00.366540.parquet"]}]}]} | 2024-01-18T13:31:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.21
Dataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.21 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
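```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.21",
	"harness_winogrande_5",
	split="train")
```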
## Latest results
These are the latest results from run 2024-01-18T13:28:00.366540 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
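Only the aggregate "all" block is reproduced below as a short excerpt (the values come from that run's results file; the full per-task breakdown is available in this repo):

```python
{
    "all": {
        "acc": 0.6540458763545218,
        "acc_stderr": 0.032093019516955965,
        "acc_norm": 0.6534601787133112,
        "acc_norm_stderr": 0.032764115724543935,
        "mc1": 0.5397796817625459,
        "mc1_stderr": 0.017448017223960867,
        "mc2": 0.6799010994882542,
        "mc2_stderr": 0.01527627642493985
    }
}
```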
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.21\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.21 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:28:00.366540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.21\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.21 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:28:00.366540(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9183981194ba1098e5237c16b34f3993521bcac0 |
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.1](https://huggingface.co/xriminact/TarsChattyBasev0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1",
"harness_winogrande_5",
split="train")
```
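As a minimal sketch (assuming the standard `datasets` API; the "results" config name and the "latest" split follow the naming described above), the aggregated scores can also be pulled into a pandas DataFrame:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1",
                       "results",
                       split="latest")
df = results.to_pandas()  # typically one row per evaluation run
print(df.columns)
```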
## Latest results
These are the [latest results from run 2024-01-18T13:28:46.282791](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1/blob/main/results_2024-01-18T13-28-46.282791.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5543673608069786,
"acc_stderr": 0.03400787921234687,
"acc_norm": 0.5627560913076616,
"acc_norm_stderr": 0.034799828523172316,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.41412077793772695,
"mc2_stderr": 0.014661007860915117
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670444,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809174
},
"harness|hellaswag|10": {
"acc": 0.6289583748257319,
"acc_stderr": 0.004820962855749738,
"acc_norm": 0.8241386178052181,
"acc_norm_stderr": 0.0037992414085029564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983045,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165635,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165635
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461213,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461213
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.019342036587702588,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.019342036587702588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.029936696387138598,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.029936696387138598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527827,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527827
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.02795604616542452,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.02795604616542452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140112,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140112
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871588,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41134289439374183,
"acc_stderr": 0.01256788267380368,
"acc_norm": 0.41134289439374183,
"acc_norm_stderr": 0.01256788267380368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.046313813194254656,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.046313813194254656
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087555,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087555
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219371,
"mc2": 0.41412077793772695,
"mc2_stderr": 0.014661007860915117
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011874
},
"harness|gsm8k|5": {
"acc": 0.09401061410159212,
"acc_stderr": 0.008038819818872452
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1 | [
"region:us"
] | 2024-01-18T13:31:07+00:00 | {"pretty_name": "Evaluation run of xriminact/TarsChattyBasev0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.1](https://huggingface.co/xriminact/TarsChattyBasev0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:28:46.282791](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1/blob/main/results_2024-01-18T13-28-46.282791.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5543673608069786,\n \"acc_stderr\": 0.03400787921234687,\n \"acc_norm\": 0.5627560913076616,\n \"acc_norm_stderr\": 0.034799828523172316,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.41412077793772695,\n \"mc2_stderr\": 0.014661007860915117\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670444,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6289583748257319,\n \"acc_stderr\": 0.004820962855749738,\n \"acc_norm\": 0.8241386178052181,\n \"acc_norm_stderr\": 0.0037992414085029564\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983045,\n \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983045\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411894,\n \"acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165635,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165635\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845426,\n \"acc_norm\": 0.7564766839378239,\n 
\"acc_norm_stderr\": 0.030975436386845426\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461213,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461213\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7155963302752294,\n \"acc_stderr\": 0.019342036587702588,\n \"acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.019342036587702588\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6962025316455697,\n \"acc_stderr\": 0.029936696387138598,\n \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.029936696387138598\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.7381864623243933,\n \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n \"acc_stderr\": 0.014635185616527827,\n \"acc_norm\": 0.2581005586592179,\n \"acc_norm_stderr\": 0.014635185616527827\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.02795604616542452,\n \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.02795604616542452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n \"acc_stderr\": 0.027466610213140112,\n \"acc_norm\": 0.6270096463022508,\n \"acc_norm_stderr\": 0.027466610213140112\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871588,\n \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871588\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n \"acc_stderr\": 0.01256788267380368,\n \"acc_norm\": 0.41134289439374183,\n \"acc_norm_stderr\": 0.01256788267380368\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003483,\n \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003483\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887184,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887184\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.046313813194254656,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.046313813194254656\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087555,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087555\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219371,\n \"mc2\": 0.41412077793772695,\n \"mc2_stderr\": 0.014661007860915117\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.09401061410159212,\n \"acc_stderr\": 0.008038819818872452\n }\n}\n```", "repo_url": "https://huggingface.co/xriminact/TarsChattyBasev0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-28-46.282791.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["**/details_harness|winogrande|5_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T13-28-46.282791.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T13_28_46.282791", "path": ["results_2024-01-18T13-28-46.282791.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-28-46.282791.parquet"]}]}]} | 2024-01-18T13:31:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.1
Dataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
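A minimal sketch using the `datasets` library (the configuration name `harness_winogrande_5` is just one of the per-task configurations listed in this dataset's metadata):
```python
from datasets import load_dataset

# Load one of the per-task detail configurations; "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1",
                    "harness_winogrande_5",
                    split="train")
```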
## Latest results
These are the latest results from run 2024-01-18T13:28:46.282791 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
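As a small companion sketch, the aggregated metrics behind these numbers can also be pulled from the "results" configuration (per the configuration list in this dataset's metadata, the "latest" split tracks the most recent run):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.1",
                       "results",
                       split="latest")
print(results[0])
```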
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.1\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:28:46.282791(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.1\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:28:46.282791(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2b137b5cf91bf05d9f97feb296b4549f1d627f12 |
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppostep_100
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b_ppostep_100](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppostep_100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100",
"harness_winogrande_5",
split="train")
```
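
The same pattern also covers the aggregated "results" configuration, and lets you pin a specific run through its timestamped split instead of "train". The sketch below uses the split names listed in this card's configuration metadata; adjust them if newer runs have been added since:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100"

# Aggregated metrics for the run: the "results" configuration, whose "latest"
# split always points to the most recent evaluation.
results = load_dataset(REPO, "results", split="latest")

# Pin one specific run by using its timestamped split name instead of "train".
winogrande_run = load_dataset(REPO, "harness_winogrande_5",
                              split="2024_01_18T13_32_11.965537")
```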
## Latest results
These are the [latest results from run 2024-01-18T13:32:11.965537](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100/blob/main/results_2024-01-18T13-32-11.965537.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23774405306004376,
"acc_stderr": 0.030137212407426187,
"acc_norm": 0.23884860895654994,
"acc_norm_stderr": 0.030942263352296245,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862661,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.2167235494880546,
"acc_stderr": 0.012040156713481189,
"acc_norm": 0.29266211604095566,
"acc_norm_stderr": 0.013295916103619397
},
"harness|hellaswag|10": {
"acc": 0.25562636924915355,
"acc_stderr": 0.004353212146198442,
"acc_norm": 0.25871340370444135,
"acc_norm_stderr": 0.004370328224831795
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.02575755989310678,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.02575755989310678
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.18497109826589594,
"acc_stderr": 0.02960562398177122,
"acc_norm": 0.18497109826589594,
"acc_norm_stderr": 0.02960562398177122
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220575,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220575
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180267,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180267
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.02811209121011745,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.02811209121011745
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24615384615384617,
"acc_stderr": 0.021840866990423095,
"acc_norm": 0.24615384615384617,
"acc_norm_stderr": 0.021840866990423095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507385,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507385
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361266,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767478,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767478
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.016139174096522574,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.016139174096522574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.022289638852617904,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.022289638852617904
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808855,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808855
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818704,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721377,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721377
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.19402985074626866,
"acc_stderr": 0.027962677604768924,
"acc_norm": 0.19402985074626866,
"acc_norm_stderr": 0.027962677604768924
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862661,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.489344909234412,
"acc_stderr": 0.014049294536290403
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
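
If you prefer the raw metrics file over the `datasets` API, the JSON linked in the "Latest results" section can be downloaded straight from the dataset repository. This is only a sketch: the filename comes from that link, and the `.get("results", payload)` fallback is an assumption about whether the file nests the dictionary shown above under a "results" key:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100",
    repo_type="dataset",
    filename="results_2024-01-18T13-32-11.965537.json",
)

with open(path) as f:
    payload = json.load(f)

# The dictionary printed above may be the whole file or live under a "results" key.
metrics = payload.get("results", payload)

print(metrics["harness|winogrande|5"]["acc"])  # 0.489344909234412
print(metrics["all"]["acc_norm"])              # 0.23884860895654994
```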
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100 | [
"region:us"
] | 2024-01-18T13:34:31+00:00 | {"pretty_name": "Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppostep_100", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b_ppostep_100](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppostep_100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:32:11.965537](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100/blob/main/results_2024-01-18T13-32-11.965537.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23774405306004376,\n \"acc_stderr\": 0.030137212407426187,\n \"acc_norm\": 0.23884860895654994,\n \"acc_norm_stderr\": 0.030942263352296245,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2167235494880546,\n \"acc_stderr\": 0.012040156713481189,\n \"acc_norm\": 0.29266211604095566,\n \"acc_norm_stderr\": 0.013295916103619397\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25562636924915355,\n \"acc_stderr\": 0.004353212146198442,\n \"acc_norm\": 0.25871340370444135,\n \"acc_norm_stderr\": 0.004370328224831795\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310678,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310678\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n 
\"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.18497109826589594,\n \"acc_stderr\": 0.02960562398177122,\n \"acc_norm\": 0.18497109826589594,\n \"acc_norm_stderr\": 0.02960562398177122\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220575,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220575\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180267,\n \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180267\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.1919191919191919,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.18652849740932642,\n \"acc_stderr\": 
0.02811209121011745,\n \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.02811209121011745\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24615384615384617,\n \"acc_stderr\": 0.021840866990423095,\n \"acc_norm\": 0.24615384615384617,\n \"acc_norm_stderr\": 0.021840866990423095\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507385,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507385\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361266,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361266\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767478,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767478\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.3811659192825112,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n 
\"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2848020434227331,\n \"acc_stderr\": 0.016139174096522574,\n \"acc_norm\": 0.2848020434227331,\n \"acc_norm_stderr\": 0.016139174096522574\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.022289638852617904,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.022289638852617904\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808855,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808855\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n \"acc_stderr\": 0.024185150647818704,\n \"acc_norm\": 0.2379421221864952,\n \"acc_norm_stderr\": 0.024185150647818704\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.04069306319721377,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.04069306319721377\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.19402985074626866,\n \"acc_stderr\": 0.027962677604768924,\n \"acc_norm\": 0.19402985074626866,\n \"acc_norm_stderr\": 0.027962677604768924\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.489344909234412,\n \"acc_stderr\": 
0.014049294536290403\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b_ppostep_100", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-32-11.965537.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-32-11.965537.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-32-11.965537.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-32-11.965537.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-32-11.965537.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["**/details_harness|winogrande|5_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T13-32-11.965537.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T13_32_11.965537", "path": ["results_2024-01-18T13-32-11.965537.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-32-11.965537.parquet"]}]}]} | 2024-01-18T13:34:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppostep_100
Dataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppostep_100 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
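The original card's code snippet is not reproduced here; a minimal sketch of the intended call might look like the following, assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming (the repository id below is inferred from the evaluated model name, not taken from a preserved link):
```python
from datasets import load_dataset

# Assumed repository id, derived from the evaluated model name.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b_ppostep_100",
    "harness_winogrande_5",   # one of the per-task configurations listed in this card's metadata
    split="train",            # "train" always points to the latest results
)
```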
## Latest results
These are the latest results from run 2024-01-18T13:32:11.965537 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppostep_100\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppostep_100 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:32:11.965537(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b_ppostep_100\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/alignment-handbook-zephyr-7b_ppostep_100 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:32:11.965537(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
67303255447f1738da5ae744f6bd623cf8da857e |
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.0](https://huggingface.co/xriminact/TarsChattyBasev0.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.0",
"harness_winogrande_5",
split="train")
```
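The aggregated scores described above live in the "results" configuration; a minimal sketch of loading them, assuming the "latest" split naming used by the per-task configurations in this repository also applies here (the exact record schema may vary):

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_xriminact__TarsChattyBasev0.0",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated record
```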
## Latest results
These are the [latest results from run 2024-01-18T13:38:14.654966](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.0/blob/main/results_2024-01-18T13-38-14.654966.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5779292674272114,
"acc_stderr": 0.033798356990108386,
"acc_norm": 0.5861112058905269,
"acc_norm_stderr": 0.03457519469727729,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6170984357524522,
"mc2_stderr": 0.015233013125868297
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.01415702255540716,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.6416052579167496,
"acc_stderr": 0.004785488626807578,
"acc_norm": 0.8457478589922326,
"acc_norm_stderr": 0.0036045210852464343
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936338,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936338
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752052,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752052
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.0259060870213193,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.0259060870213193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.03097543638684542,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.03097543638684542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966266,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.0454160944650395,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.0454160944650395
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665227,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665227
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691814,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691814
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.02572280220089582,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.02572280220089582
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32625698324022345,
"acc_stderr": 0.015680441518889178,
"acc_norm": 0.32625698324022345,
"acc_norm_stderr": 0.015680441518889178
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971646,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41134289439374183,
"acc_stderr": 0.012567882673803682,
"acc_norm": 0.41134289439374183,
"acc_norm_stderr": 0.012567882673803682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5964052287581699,
"acc_stderr": 0.019848280168401147,
"acc_norm": 0.5964052287581699,
"acc_norm_stderr": 0.019848280168401147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789848,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789848
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6170984357524522,
"mc2_stderr": 0.015233013125868297
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.01152446695409026
},
"harness|gsm8k|5": {
"acc": 0.11675511751326763,
"acc_stderr": 0.0088454681369191
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xriminact__TarsChattyBasev0.0 | [
"region:us"
] | 2024-01-18T13:40:38+00:00 | {"pretty_name": "Evaluation run of xriminact/TarsChattyBasev0.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.0](https://huggingface.co/xriminact/TarsChattyBasev0.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xriminact__TarsChattyBasev0.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:38:14.654966](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.0/blob/main/results_2024-01-18T13-38-14.654966.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5779292674272114,\n \"acc_stderr\": 0.033798356990108386,\n \"acc_norm\": 0.5861112058905269,\n \"acc_norm_stderr\": 0.03457519469727729,\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6170984357524522,\n \"mc2_stderr\": 0.015233013125868297\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.01415702255540716,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6416052579167496,\n \"acc_stderr\": 0.004785488626807578,\n \"acc_norm\": 0.8457478589922326,\n \"acc_norm_stderr\": 0.0036045210852464343\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936525,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936338,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936338\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752052,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752052\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n \"acc_stderr\": 0.0259060870213193,\n \"acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.0259060870213193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.03097543638684542,\n \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.03097543638684542\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846482,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846482\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966266,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966266\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.0454160944650395,\n \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.0454160944650395\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.02559819368665227,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.02559819368665227\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n \"acc_stderr\": 
0.015046301846691814,\n \"acc_norm\": 0.7701149425287356,\n \"acc_norm_stderr\": 0.015046301846691814\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089582,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089582\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32625698324022345,\n \"acc_stderr\": 0.015680441518889178,\n \"acc_norm\": 0.32625698324022345,\n \"acc_norm_stderr\": 0.015680441518889178\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971646,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971646\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41134289439374183,\n \"acc_stderr\": 0.012567882673803682,\n \"acc_norm\": 0.41134289439374183,\n \"acc_norm_stderr\": 0.012567882673803682\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5964052287581699,\n \"acc_stderr\": 0.019848280168401147,\n \"acc_norm\": 0.5964052287581699,\n \"acc_norm_stderr\": 0.019848280168401147\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6170984357524522,\n \"mc2_stderr\": 0.015233013125868297\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.01152446695409026\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11675511751326763,\n \"acc_stderr\": 0.0088454681369191\n }\n}\n```", "repo_url": 
"https://huggingface.co/xriminact/TarsChattyBasev0.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-38-14.654966.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-38-14.654966.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-38-14.654966.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-38-14.654966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-38-14.654966.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_38_14.654966", "path": ["**/details_harness|winogrande|5_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-38-14.654966.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_38_14.654966", "path": ["results_2024-01-18T13-38-14.654966.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-38-14.654966.parquet"]}]}]} | 2024-01-18T13:41:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.0
Dataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-18T13:38:14.654966 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.0\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:38:14.654966(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.0\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsChattyBasev0.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:38:14.654966(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
932eb7d39c1b110b3bc005ae22e7e11481d46ae8 |
# Dataset Card for Evaluation run of OdiaGenAI/odia_llama2_7B_base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OdiaGenAI/odia_llama2_7B_base](https://huggingface.co/OdiaGenAI/odia_llama2_7B_base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
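As a quick sanity check, the available configurations and splits can be enumerated with the `datasets` inspection helpers. This is only an illustrative sketch; the repository id is the same one used in the loading example below.

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo_id = "open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base"

# List the available configurations (per-task configs plus the aggregated "results" config).
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])

# Each configuration exposes a timestamped split plus a "latest" alias.
print(get_dataset_split_names(repo_id, "harness_winogrande_5"))
```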
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base",
"harness_winogrande_5",
split="train")
```
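Similarly, the aggregated scores alone can be pulled through the `results` configuration; the sketch below assumes the `latest` split alias described above.

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base",
    "results",
    split="latest",
)
print(results[0])
```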
## Latest results
These are the [latest results from run 2024-01-18T13:44:42.651990](https://huggingface.co/datasets/open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base/blob/main/results_2024-01-18T13-44-42.651990.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4616244645045534,
"acc_stderr": 0.03449845090104848,
"acc_norm": 0.4668248334019628,
"acc_norm_stderr": 0.03527397181864705,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871122,
"mc2": 0.3727196891546005,
"mc2_stderr": 0.013923104623490107
},
"harness|arc:challenge|25": {
"acc": 0.46245733788395904,
"acc_stderr": 0.014570144495075576,
"acc_norm": 0.507679180887372,
"acc_norm_stderr": 0.01460966744089257
},
"harness|hellaswag|10": {
"acc": 0.5637323242381995,
"acc_stderr": 0.004949080334816021,
"acc_norm": 0.7594104760007967,
"acc_norm_stderr": 0.004265678940698866
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776568,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776568
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.028372287797962928,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.028372287797962928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036543,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036543
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03540294377095367,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03540294377095367
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41025641025641024,
"acc_stderr": 0.024939313906940788,
"acc_norm": 0.41025641025641024,
"acc_norm_stderr": 0.024939313906940788
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987053,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987053
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6018348623853211,
"acc_stderr": 0.020987989422654264,
"acc_norm": 0.6018348623853211,
"acc_norm_stderr": 0.020987989422654264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.0316746870682898,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.0316746870682898
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239172,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239172
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.03327283370271344,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.03327283370271344
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041695,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041695
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674064,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674064
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6385696040868455,
"acc_stderr": 0.017179601328900732,
"acc_norm": 0.6385696040868455,
"acc_norm_stderr": 0.017179601328900732
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48265895953757226,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.48265895953757226,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.01440029642922559,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.01440029642922559
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.028320325830105915,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.028320325830105915
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02812163604063989,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02812163604063989
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3259452411994785,
"acc_stderr": 0.011971507294982775,
"acc_norm": 0.3259452411994785,
"acc_norm_stderr": 0.011971507294982775
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.020054269200726463,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.020054269200726463
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5373134328358209,
"acc_stderr": 0.035256751674679745,
"acc_norm": 0.5373134328358209,
"acc_norm_stderr": 0.035256751674679745
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871122,
"mc2": 0.3727196891546005,
"mc2_stderr": 0.013923104623490107
},
"harness|winogrande|5": {
"acc": 0.7079715864246251,
"acc_stderr": 0.012779198491754013
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.009818090723727288
}
}
```
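If you prefer to work with the raw file rather than the `datasets` loader, the JSON linked above can be fetched directly. This is a sketch only: the exact top-level nesting of the file is not documented here, so inspect the keys before relying on them.

```python
import json
from huggingface_hub import hf_hub_download

# Download the results file referenced in this section from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base",
    repo_type="dataset",
    filename="results_2024-01-18T13-44-42.651990.json",
)
with open(path) as f:
    data = json.load(f)

# Inspect the structure first; the per-task metrics shown above
# (acc / acc_norm / mc1 / mc2 and their stderrs) live in a nested mapping.
print(list(data.keys()))
```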
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base | [
"region:us"
] | 2024-01-18T13:47:03+00:00 | {"pretty_name": "Evaluation run of OdiaGenAI/odia_llama2_7B_base", "dataset_summary": "Dataset automatically created during the evaluation run of model [OdiaGenAI/odia_llama2_7B_base](https://huggingface.co/OdiaGenAI/odia_llama2_7B_base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:44:42.651990](https://huggingface.co/datasets/open-llm-leaderboard/details_OdiaGenAI__odia_llama2_7B_base/blob/main/results_2024-01-18T13-44-42.651990.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4616244645045534,\n \"acc_stderr\": 0.03449845090104848,\n \"acc_norm\": 0.4668248334019628,\n \"acc_norm_stderr\": 0.03527397181864705,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871122,\n \"mc2\": 0.3727196891546005,\n \"mc2_stderr\": 0.013923104623490107\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.46245733788395904,\n \"acc_stderr\": 0.014570144495075576,\n \"acc_norm\": 0.507679180887372,\n \"acc_norm_stderr\": 0.01460966744089257\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5637323242381995,\n \"acc_stderr\": 0.004949080334816021,\n \"acc_norm\": 0.7594104760007967,\n \"acc_norm_stderr\": 0.004265678940698866\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270658,\n \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270658\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n 
\"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776568,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776568\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n \"acc_stderr\": 0.028372287797962928,\n \"acc_norm\": 0.535483870967742,\n \"acc_norm_stderr\": 0.028372287797962928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036543,\n \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036543\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03540294377095367,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03540294377095367\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048574,\n \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048574\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.41025641025641024,\n \"acc_stderr\": 0.024939313906940788,\n \"acc_norm\": 0.41025641025641024,\n \"acc_norm_stderr\": 0.024939313906940788\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987053,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987053\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6018348623853211,\n \"acc_stderr\": 0.020987989422654264,\n \"acc_norm\": 0.6018348623853211,\n \"acc_norm_stderr\": 0.020987989422654264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.0316746870682898,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.0316746870682898\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239172,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239172\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n \"acc_stderr\": 0.03327283370271344,\n \"acc_norm\": 0.5650224215246636,\n \"acc_norm_stderr\": 0.03327283370271344\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041695,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041695\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n \"acc_stderr\": 0.029745048572674064,\n \"acc_norm\": 0.7094017094017094,\n \"acc_norm_stderr\": 0.029745048572674064\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6385696040868455,\n \"acc_stderr\": 0.017179601328900732,\n \"acc_norm\": 0.6385696040868455,\n \"acc_norm_stderr\": 0.017179601328900732\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.01440029642922559,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.01440029642922559\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n \"acc_stderr\": 0.028320325830105915,\n \"acc_norm\": 0.5369774919614148,\n \"acc_norm_stderr\": 0.028320325830105915\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02812163604063989,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02812163604063989\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3259452411994785,\n \"acc_stderr\": 0.011971507294982775,\n \"acc_norm\": 0.3259452411994785,\n \"acc_norm_stderr\": 0.011971507294982775\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.029624663581159703,\n \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.029624663581159703\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.020054269200726463,\n \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.020054269200726463\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n \"acc_stderr\": 0.035256751674679745,\n \"acc_norm\": 0.5373134328358209,\n \"acc_norm_stderr\": 0.035256751674679745\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871122,\n \"mc2\": 0.3727196891546005,\n \"mc2_stderr\": 0.013923104623490107\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754013\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \"acc_stderr\": 
0.009818090723727288\n }\n}\n```", "repo_url": "https://huggingface.co/OdiaGenAI/odia_llama2_7B_base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-44-42.651990.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-44-42.651990.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-44-42.651990.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-44-42.651990.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-44-42.651990.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_44_42.651990", "path": ["**/details_harness|winogrande|5_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-44-42.651990.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_44_42.651990", "path": ["results_2024-01-18T13-44-42.651990.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-44-42.651990.parquet"]}]}]} | 2024-01-18T13:47:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OdiaGenAI/odia_llama2_7B_base
Dataset automatically created during the evaluation run of model OdiaGenAI/odia_llama2_7B_base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-18T13:44:42.651990 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OdiaGenAI/odia_llama2_7B_base\n\n\n\nDataset automatically created during the evaluation run of model OdiaGenAI/odia_llama2_7B_base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:44:42.651990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OdiaGenAI/odia_llama2_7B_base\n\n\n\nDataset automatically created during the evaluation run of model OdiaGenAI/odia_llama2_7B_base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:44:42.651990(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3e2da0659a80bc5d057fb357ccecc370d130f74a |
# Dataset Card for Evaluation run of duoqi/Nanbeige-16B-Base-Llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [duoqi/Nanbeige-16B-Base-Llama](https://huggingface.co/duoqi/Nanbeige-16B-Base-Llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama",
"harness_winogrande_5",
split="train")
```
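If you are only interested in the aggregated scores rather than the per-sample details, you can also load the "results" configuration directly. The snippet below is a minimal sketch using the same `datasets` library as above; the exact column layout of the "results" configuration may differ from the JSON summary shown further down.

```python
from datasets import load_dataset

# Load the aggregated results for the most recent evaluation run.
# "latest" is an alias split that always points to the newest timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama",
    "results",
    split="latest",
)

# Inspect the first (and typically only) record of aggregated metrics.
print(results[0])
```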
## Latest results
These are the [latest results from run 2024-01-18T13:45:09.426165](https://huggingface.co/datasets/open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama/blob/main/results_2024-01-18T13-45-09.426165.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6301701399529482,
"acc_stderr": 0.032386700810762455,
"acc_norm": 0.6349032587315631,
"acc_norm_stderr": 0.03303120442847387,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.42602438122791003,
"mc2_stderr": 0.014551892220412064
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5648464163822525,
"acc_norm_stderr": 0.014487986197186045
},
"harness|hellaswag|10": {
"acc": 0.5865365465046803,
"acc_stderr": 0.004914480534533712,
"acc_norm": 0.7896833300139414,
"acc_norm_stderr": 0.004067006345542834
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887249,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887249
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.039215453124671215,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.039215453124671215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469543,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089551,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089551
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.02747960301053881,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.02747960301053881
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406215,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406215
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646508,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646508
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612924,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612924
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037183,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037183
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892499,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892499
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236834,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236834
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.01272844606766998,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.01272844606766998
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.42602438122791003,
"mc2_stderr": 0.014551892220412064
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.01204235252617479
},
"harness|gsm8k|5": {
"acc": 0.4700530705079606,
"acc_stderr": 0.013747759685444703
}
}
```
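As a quick sanity check, the per-task scores above can be aggregated in a few lines of Python. The sketch below is only illustrative and assumes the JSON block above has been saved to a local file named `results.json` (a hypothetical path, not part of this repository).

```python
import json

# Hypothetical local copy of the JSON summary shown above.
with open("results.json") as f:
    results = json.load(f)

# Average the normalized accuracies of the individual hendrycksTest (MMLU-style) subtasks.
mmlu = [
    scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average (acc_norm): {sum(mmlu) / len(mmlu):.4f}")

# The "all" entry already holds the aggregate over every task in the run.
print(f"Overall acc_norm: {results['all']['acc_norm']:.4f}")
```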
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama | [
"region:us"
] | 2024-01-18T13:47:29+00:00 | {"pretty_name": "Evaluation run of duoqi/Nanbeige-16B-Base-Llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [duoqi/Nanbeige-16B-Base-Llama](https://huggingface.co/duoqi/Nanbeige-16B-Base-Llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:45:09.426165](https://huggingface.co/datasets/open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama/blob/main/results_2024-01-18T13-45-09.426165.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6301701399529482,\n \"acc_stderr\": 0.032386700810762455,\n \"acc_norm\": 0.6349032587315631,\n \"acc_norm_stderr\": 0.03303120442847387,\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.42602438122791003,\n \"mc2_stderr\": 0.014551892220412064\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n \"acc_norm\": 0.5648464163822525,\n \"acc_norm_stderr\": 0.014487986197186045\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5865365465046803,\n \"acc_stderr\": 0.004914480534533712,\n \"acc_norm\": 0.7896833300139414,\n \"acc_norm_stderr\": 0.004067006345542834\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887249,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887249\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.031778212502369216,\n \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.031778212502369216\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.039215453124671215,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.039215453124671215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469543,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469543\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.02447224384089551,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.02447224384089551\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.02747960301053881,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.02747960301053881\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406215,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406215\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646508,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646508\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612924,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612924\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037183,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037183\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.01911989279892499,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.01911989279892499\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236834,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236834\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.01272844606766998,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.01272844606766998\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954847,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954847\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.42602438122791003,\n \"mc2_stderr\": 0.014551892220412064\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.01204235252617479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4700530705079606,\n \"acc_stderr\": 0.013747759685444703\n 
}\n}\n```", "repo_url": "https://huggingface.co/duoqi/Nanbeige-16B-Base-Llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-45-09.426165.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-45-09.426165.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-45-09.426165.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-45-09.426165.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-45-09.426165.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_45_09.426165", "path": ["**/details_harness|winogrande|5_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-45-09.426165.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_45_09.426165", "path": ["results_2024-01-18T13-45-09.426165.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-45-09.426165.parquet"]}]}]} | 2024-01-18T13:47:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of duoqi/Nanbeige-16B-Base-Llama
Dataset automatically created during the evaluation run of model duoqi/Nanbeige-16B-Base-Llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
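A minimal sketch of that loading step is shown below. The repo id is an illustration only: it assumes this model's details repository follows the leaderboard's usual `details_<org>__<model>` naming convention, since the card above does not state it explicitly; the configuration name is one of the 63 configurations listed in this record's metadata.

```python
from datasets import load_dataset

# Hypothetical repo id, assuming the leaderboard's "details_<org>__<model>" convention
data = load_dataset(
    "open-llm-leaderboard/details_duoqi__Nanbeige-16B-Base-Llama",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # the "train" split always points to the latest results
)
```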
## Latest results
These are the latest results from run 2024-01-18T13:45:09.426165 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of duoqi/Nanbeige-16B-Base-Llama\n\n\n\nDataset automatically created during the evaluation run of model duoqi/Nanbeige-16B-Base-Llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:45:09.426165(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of duoqi/Nanbeige-16B-Base-Llama\n\n\n\nDataset automatically created during the evaluation run of model duoqi/Nanbeige-16B-Base-Llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:45:09.426165(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2922aae26b47b8918f418d2fc4fedf981da58cc9 |
# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Kan-LLaMA-7B-base](https://huggingface.co/fierysurf/Kan-LLaMA-7B-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
"harness_winogrande_5",
split="train")
```
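The aggregated scores can be loaded the same way. Below is a small sketch that assumes the "results" configuration described above exposes a "latest" split, matching the split layout shown in the metadata of these leaderboard details datasets:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; "latest" points at the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest evaluation run
```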
## Latest results
These are the [latest results from run 2024-01-18T13:48:16.932348](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base/blob/main/results_2024-01-18T13-48-16.932348.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.37263074581051026,
"acc_stderr": 0.0338849247942205,
"acc_norm": 0.3774408949562487,
"acc_norm_stderr": 0.03480722110246682,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3957474692508163,
"mc2_stderr": 0.014345144003847196
},
"harness|arc:challenge|25": {
"acc": 0.4069965870307167,
"acc_stderr": 0.014356399418009128,
"acc_norm": 0.439419795221843,
"acc_norm_stderr": 0.014503747823580127
},
"harness|hellaswag|10": {
"acc": 0.5163314080860386,
"acc_stderr": 0.004987119003151497,
"acc_norm": 0.7075283808006373,
"acc_norm_stderr": 0.004539680764142161
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741543,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36129032258064514,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.36129032258064514,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380025,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380025
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.03550702465131343,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.03550702465131343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.44220183486238535,
"acc_stderr": 0.021293613207520216,
"acc_norm": 0.44220183486238535,
"acc_norm_stderr": 0.021293613207520216
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762695,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762695
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.0345423658538061,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.0345423658538061
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4008438818565401,
"acc_stderr": 0.03190080389473236,
"acc_norm": 0.4008438818565401,
"acc_norm_stderr": 0.03190080389473236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.44274809160305345,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.44274809160305345,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.045604560863872344,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.045604560863872344
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.37423312883435583,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.37423312883435583,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5427350427350427,
"acc_stderr": 0.03263622596380688,
"acc_norm": 0.5427350427350427,
"acc_norm_stderr": 0.03263622596380688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5019157088122606,
"acc_stderr": 0.017879832259026677,
"acc_norm": 0.5019157088122606,
"acc_norm_stderr": 0.017879832259026677
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02791405551046803,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02791405551046803
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4694533762057878,
"acc_stderr": 0.02834504586484069,
"acc_norm": 0.4694533762057878,
"acc_norm_stderr": 0.02834504586484069
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.02700252103451649,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.02700252103451649
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503796,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30964797913950454,
"acc_stderr": 0.011808598262503321,
"acc_norm": 0.30964797913950454,
"acc_norm_stderr": 0.011808598262503321
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406797,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406797
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32189542483660133,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.32189542483660133,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.03820042586602966,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.03820042586602966
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3957474692508163,
"mc2_stderr": 0.014345144003847196
},
"harness|winogrande|5": {
"acc": 0.6850828729281768,
"acc_stderr": 0.013054277568469231
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
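If only the raw scores are needed, the results file linked in the "Latest results" section can also be fetched directly. This is a sketch using `huggingface_hub`; the file name is taken from the link above, while the exact nesting of the scores inside the file is an assumption, so the snippet handles both a top-level dictionary and one wrapped under a "results" key:

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced in the "Latest results" section
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
    repo_type="dataset",
    filename="results_2024-01-18T13-48-16.932348.json",
)
with open(path) as f:
    data = json.load(f)

# The dictionary shown above may sit at the top level or under a "results" key
scores = data.get("results", data)
print(scores["all"]["acc_norm"])  # aggregate normalized accuracy for the run
```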
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base | [
"region:us"
] | 2024-01-18T13:50:38+00:00 | {"pretty_name": "Evaluation run of fierysurf/Kan-LLaMA-7B-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [fierysurf/Kan-LLaMA-7B-base](https://huggingface.co/fierysurf/Kan-LLaMA-7B-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T13:48:16.932348](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base/blob/main/results_2024-01-18T13-48-16.932348.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37263074581051026,\n \"acc_stderr\": 0.0338849247942205,\n \"acc_norm\": 0.3774408949562487,\n \"acc_norm_stderr\": 0.03480722110246682,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3957474692508163,\n \"mc2_stderr\": 0.014345144003847196\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4069965870307167,\n \"acc_stderr\": 0.014356399418009128,\n \"acc_norm\": 0.439419795221843,\n \"acc_norm_stderr\": 0.014503747823580127\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5163314080860386,\n \"acc_stderr\": 0.004987119003151497,\n \"acc_norm\": 0.7075283808006373,\n \"acc_norm_stderr\": 0.004539680764142161\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n 
\"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24338624338624337,\n \"acc_stderr\": 0.02210112878741543,\n \"acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.02210112878741543\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36129032258064514,\n \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.36129032258064514,\n \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380025,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380025\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4595959595959596,\n \"acc_stderr\": 0.03550702465131343,\n \"acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.03550702465131343\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.035944137112724366,\n \"acc_norm\": 0.45595854922279794,\n \"acc_norm_stderr\": 0.035944137112724366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.358974358974359,\n \"acc_stderr\": 0.024321738484602354,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.44220183486238535,\n \"acc_stderr\": 0.021293613207520216,\n \"acc_norm\": 0.44220183486238535,\n \"acc_norm_stderr\": 0.021293613207520216\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02988691054762695,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02988691054762695\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.0345423658538061,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.0345423658538061\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4008438818565401,\n \"acc_stderr\": 0.03190080389473236,\n \"acc_norm\": 0.4008438818565401,\n \"acc_norm_stderr\": 0.03190080389473236\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.40358744394618834,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.44274809160305345,\n \"acc_stderr\": 0.043564472026650695,\n \"acc_norm\": 0.44274809160305345,\n \"acc_norm_stderr\": 0.043564472026650695\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4793388429752066,\n \"acc_stderr\": 0.045604560863872344,\n \"acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.045604560863872344\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.37423312883435583,\n \"acc_stderr\": 0.03802068102899616,\n \"acc_norm\": 0.37423312883435583,\n \"acc_norm_stderr\": 0.03802068102899616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5427350427350427,\n \"acc_stderr\": 0.03263622596380688,\n \"acc_norm\": 0.5427350427350427,\n \"acc_norm_stderr\": 0.03263622596380688\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5019157088122606,\n \"acc_stderr\": 
0.017879832259026677,\n \"acc_norm\": 0.5019157088122606,\n \"acc_norm_stderr\": 0.017879832259026677\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.02632981334194624,\n \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.02632981334194624\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02791405551046803,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02791405551046803\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4694533762057878,\n \"acc_stderr\": 0.02834504586484069,\n \"acc_norm\": 0.4694533762057878,\n \"acc_norm_stderr\": 0.02834504586484069\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.02700252103451649,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.02700252103451649\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503796,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503796\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30964797913950454,\n \"acc_stderr\": 0.011808598262503321,\n \"acc_norm\": 0.30964797913950454,\n \"acc_norm_stderr\": 0.011808598262503321\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406797,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406797\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.32189542483660133,\n \"acc_stderr\": 0.018901015322093085,\n \"acc_norm\": 0.32189542483660133,\n \"acc_norm_stderr\": 0.018901015322093085\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.5174129353233831,\n \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.03820042586602966,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.03820042586602966\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3957474692508163,\n \"mc2_stderr\": 0.014345144003847196\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6850828729281768,\n \"acc_stderr\": 0.013054277568469231\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/fierysurf/Kan-LLaMA-7B-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T13_48_16.932348", "path": ["**/details_harness|winogrande|5_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T13-48-16.932348.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T13_48_16.932348", "path": ["results_2024-01-18T13-48-16.932348.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T13-48-16.932348.parquet"]}]}]} | 2024-01-18T13:51:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-base
Dataset automatically created during the evaluation run of model fierysurf/Kan-LLaMA-7B-base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
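A minimal example (mirroring the loading snippet given in this card's metadata; `harness_winogrande_5` is just one of the 63 available configs):

```python
from datasets import load_dataset

# Load a single evaluation config; the "train" split always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
    "harness_winogrande_5",
    split="train",
)
```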
## Latest results
These are the latest results from run 2024-01-18T13:48:16.932348 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
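To pull these aggregated numbers programmatically, one option is to read the `results` config directly (a sketch, assuming the `results` config and `latest` split listed in this card's metadata; the exact column layout may vary between runs):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-base",
    "results",
    split="latest",
)
print(results.column_names)
print(results[0])
```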
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-base\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Kan-LLaMA-7B-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:48:16.932348(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-base\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Kan-LLaMA-7B-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T13:48:16.932348(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8c2337246ed2b7d6409db6fe10425514c68428f2 | # Binarized version of HelpSteer
### Dataset Description
A binarized version of https://huggingface.co/datasets/nvidia/HelpSteer ready for DPO using https://github.com/huggingface/alignment-handbook or similar.
For each unique prompt, we take the best- and worst-scoring responses, scored by the average of helpfulness and correctness. These are converted into MessagesList format in the 'chosen' and 'rejected' columns.
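As a rough illustration, the binarization could be reproduced along these lines (a sketch only, assuming HelpSteer's `prompt`, `response`, `helpfulness` and `correctness` columns; the actual build script may handle ties, filtering and splits differently):

```python
from collections import defaultdict
from datasets import load_dataset

helpsteer = load_dataset("nvidia/HelpSteer", split="train")

# Group responses by prompt and score each by the mean of helpfulness and correctness.
by_prompt = defaultdict(list)
for row in helpsteer:
    score = (row["helpfulness"] + row["correctness"]) / 2
    by_prompt[row["prompt"]].append((score, row["response"]))

# Keep the best- and worst-scoring response per prompt, in MessagesList format.
records = []
for prompt, scored in by_prompt.items():
    scored.sort(key=lambda pair: pair[0])
    worst, best = scored[0], scored[-1]
    records.append({
        "prompt": prompt,
        "chosen": [{"role": "user", "content": prompt},
                   {"role": "assistant", "content": best[1]}],
        "score_chosen": best[0],
        "rejected": [{"role": "user", "content": prompt},
                     {"role": "assistant", "content": worst[1]}],
        "score_rejected": worst[0],
    })
```

The resulting records follow the prompt/chosen/rejected schema that DPO recipes such as the alignment-handbook expect.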
- **Created by:** [dctanner](https://huggingface.co/dctanner) and the team at [Sablo AI](https://sablo.ai)
- **License:** CC BY 4.0 | sablo/HelpSteer_binarized | [
"language:en",
"license:cc-by-4.0",
"human-feedback",
"region:us"
] | 2024-01-18T14:11:39+00:00 | {"language": ["en"], "license": "cc-by-4.0", "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_rejected", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 69199364, "num_examples": 8130}, {"name": "test", "num_bytes": 3597313, "num_examples": 418}], "download_size": 42251007, "dataset_size": 72796677}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["human-feedback"]} | 2024-01-18T15:10:03+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc-by-4.0 #human-feedback #region-us
| # Binarized version of HelpSteer
### Dataset Description
A binarized version of URL ready for DPO using URL or similar.
For each unique prompt, we take the best- and worst-scoring responses, scored by the average of helpfulness and correctness. These are converted into MessagesList format in the 'chosen' and 'rejected' columns.
- Created by: dctanner and the team at Sablo AI
- License: CC BY 4.0 | [
"# Binarized version of HelpSteer",
"### Dataset Description\n\nA binarized version of URL ready for DPO using URL or similar.\n\nFor each unique prompt, we take the best and worst scoring (average of helpfulness and correctness) responses. These are converted into MessagesList format in the 'chosen' and 'rejected' columns.\n\n- Created by: dctanner and the team at Sablo AI\n- License: CC BY 4.0"
] | [
"TAGS\n#language-English #license-cc-by-4.0 #human-feedback #region-us \n",
"# Binarized version of HelpSteer",
"### Dataset Description\n\nA binarized version of URL ready for DPO using URL or similar.\n\nFor each unique prompt, we take the best and worst scoring (average of helpfulness and correctness) responses. These are converted into MessagesList format in the 'chosen' and 'rejected' columns.\n\n- Created by: dctanner and the team at Sablo AI\n- License: CC BY 4.0"
] |
4d4f986041e15f18cf9660c7a56c3c9b5a00a305 |
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.22
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.22](https://huggingface.co/liminerity/Blur-7b-v1.22) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.22",
"harness_winogrande_5",
split="train")
```
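To inspect one specific run rather than the latest results, you can also pass one of the timestamped split names listed in this repository's configuration metadata (a sketch, using a config and split name taken from that metadata):

```python
from datasets import load_dataset

# Load a single evaluation run by its timestamped split name.
run = load_dataset(
    "open-llm-leaderboard/details_liminerity__Blur-7b-v1.22",
    "harness_gsm8k_5",
    split="2024_01_18T14_27_00.815176",
)
print(run)
```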
## Latest results
These are the [latest results from run 2024-01-18T14:27:00.815176](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.22/blob/main/results_2024-01-18T14-27-00.815176.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5792659642890636,
"acc_stderr": 0.033736595862772584,
"acc_norm": 0.5837704661411739,
"acc_norm_stderr": 0.03445011469626218,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6795713154043607,
"mc2_stderr": 0.01513714146837095
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578276,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000324
},
"harness|hellaswag|10": {
"acc": 0.6389165504879506,
"acc_stderr": 0.004793330525656209,
"acc_norm": 0.8208524198366859,
"acc_norm_stderr": 0.003826921299075399
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799602,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799602
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.01812566918086149,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.01812566918086149
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105296,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02441494730454368,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02441494730454368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395962,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570762,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570762
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336393,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336393
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.02667561192603711,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.02667561192603711
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41460234680573665,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.41460234680573665,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6795713154043607,
"mc2_stderr": 0.01513714146837095
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722762
},
"harness|gsm8k|5": {
"acc": 0.310841546626232,
"acc_stderr": 0.012748860507777727
}
}
```
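As a small worked example, the hedged snippet below fetches that results file and recomputes the average accuracy over the MMLU (`hendrycksTest`) sub-tasks; the layout of the raw JSON is assumed to match the dictionary printed above (it may additionally be nested under a `"results"` key, which the snippet falls back to).

```python
import json
from huggingface_hub import hf_hub_download

# Download the latest results file from this details repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_liminerity__Blur-7b-v1.22",
    repo_type="dataset",
    filename="results_2024-01-18T14-27-00.815176.json",
)
with open(path) as f:
    data = json.load(f)

# Per-task scores: either at the top level (as printed above) or under "results".
scores = data.get("results", data)
mmlu = {k: v for k, v in scores.items() if k.startswith("harness|hendrycksTest-")}
avg_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU sub-tasks, mean acc = {avg_acc:.4f}")
```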
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liminerity__Blur-7b-v1.22 | [
"region:us"
] | 2024-01-18T14:17:27+00:00 | {"pretty_name": "Evaluation run of liminerity/Blur-7b-v1.22", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.22](https://huggingface.co/liminerity/Blur-7b-v1.22) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-v1.22\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T14:27:00.815176](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.22/blob/main/results_2024-01-18T14-27-00.815176.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5792659642890636,\n \"acc_stderr\": 0.033736595862772584,\n \"acc_norm\": 0.5837704661411739,\n \"acc_norm_stderr\": 0.03445011469626218,\n \"mc1\": 0.5128518971848225,\n \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6795713154043607,\n \"mc2_stderr\": 0.01513714146837095\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578276,\n \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000324\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6389165504879506,\n \"acc_stderr\": 0.004793330525656209,\n \"acc_norm\": 0.8208524198366859,\n \"acc_norm_stderr\": 0.003826921299075399\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 
0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5435897435897435,\n \"acc_stderr\": 0.025254485424799602,\n \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799602\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7669724770642202,\n \"acc_stderr\": 0.01812566918086149,\n \"acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.01812566918086149\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105296,\n \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02441494730454368,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02441494730454368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n \"acc_stderr\": 0.015464676163395962,\n 
\"acc_norm\": 0.7509578544061303,\n \"acc_norm_stderr\": 0.015464676163395962\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570762,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570762\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336393,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336393\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.02667561192603711,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.02667561192603711\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41460234680573665,\n \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.41460234680573665,\n \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6795713154043607,\n \"mc2_stderr\": 0.01513714146837095\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722762\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.310841546626232,\n \"acc_stderr\": 0.012748860507777727\n }\n}\n```", "repo_url": 
"https://huggingface.co/liminerity/Blur-7b-v1.22", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-15-07.987352.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-15-07.987352.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-15-07.987352.parquet"]}, 
{"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["**/details_harness|winogrande|5_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": ["**/details_harness|winogrande|5_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T14-27-00.815176.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T14_15_07.987352", "path": ["results_2024-01-18T14-15-07.987352.parquet"]}, {"split": "2024_01_18T14_27_00.815176", "path": 
["results_2024-01-18T14-27-00.815176.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T14-27-00.815176.parquet"]}]}]} | 2024-01-18T14:29:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.22
Dataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.22 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-18T14:27:00.815176 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.22\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.22 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:27:00.815176(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.22\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Blur-7b-v1.22 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:27:00.815176(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d0867292d780331bd6591381c7b4270a281d937a |
# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded](https://huggingface.co/fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded",
"harness_winogrande_5",
split="train")
```
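
The same pattern works for the other configurations listed in this card's metadata. As a minimal sketch (the config and split names simply follow the naming convention described above), the aggregated results for the most recent run can be loaded like this:

```python
from datasets import load_dataset

# "results" is the aggregated-results configuration; the "latest" split always
# points to the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded",
    "results",
    split="latest",
)
print(results[0])
```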
## Latest results
These are the [latest results from run 2024-01-18T14:17:05.941006](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded/blob/main/results_2024-01-18T14-17-05.941006.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4090385972648381,
"acc_stderr": 0.0341788866124078,
"acc_norm": 0.41460362597527234,
"acc_norm_stderr": 0.03503821200342104,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4504265715815864,
"mc2_stderr": 0.014995497671563001
},
"harness|arc:challenge|25": {
"acc": 0.4206484641638225,
"acc_stderr": 0.014426211252508403,
"acc_norm": 0.4590443686006826,
"acc_norm_stderr": 0.01456229107360123
},
"harness|hellaswag|10": {
"acc": 0.5291774546903008,
"acc_stderr": 0.004981278326428013,
"acc_norm": 0.714299940250946,
"acc_norm_stderr": 0.004508239594503835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.42258064516129035,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.42258064516129035,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5595854922279793,
"acc_stderr": 0.035827245300360945,
"acc_norm": 0.5595854922279793,
"acc_norm_stderr": 0.035827245300360945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37435897435897436,
"acc_stderr": 0.024537591572830513,
"acc_norm": 0.37435897435897436,
"acc_norm_stderr": 0.024537591572830513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.031282177063684594,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.031282177063684594
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4917431192660551,
"acc_stderr": 0.021434399918214338,
"acc_norm": 0.4917431192660551,
"acc_norm_stderr": 0.021434399918214338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560524,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524867,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524867
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4767932489451477,
"acc_stderr": 0.032512152011410174,
"acc_norm": 0.4767932489451477,
"acc_norm_stderr": 0.032512152011410174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5427841634738186,
"acc_stderr": 0.01781438523853444,
"acc_norm": 0.5427841634738186,
"acc_norm_stderr": 0.01781438523853444
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41329479768786126,
"acc_stderr": 0.026511261369409247,
"acc_norm": 0.41329479768786126,
"acc_norm_stderr": 0.026511261369409247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961447,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.028568699752225868,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.028568699752225868
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.41975308641975306,
"acc_stderr": 0.027460099557005138,
"acc_norm": 0.41975308641975306,
"acc_norm_stderr": 0.027460099557005138
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611317,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31681877444589307,
"acc_stderr": 0.011882349954723011,
"acc_norm": 0.31681877444589307,
"acc_norm_stderr": 0.011882349954723011
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.029029422815681393,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.029029422815681393
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.02000791273935936,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.02000791273935936
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739244,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.037117251907407486,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.037117251907407486
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4504265715815864,
"mc2_stderr": 0.014995497671563001
},
"harness|winogrande|5": {
"acc": 0.6882399368587214,
"acc_stderr": 0.013018571197638548
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.004302045046564279
}
}
```
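
As a small illustration of working with these aggregated numbers, the macro-average over the MMLU sub-tasks can be recovered from the block above. This is only a sketch: it assumes the JSON has already been parsed into a Python dict named `results` (e.g. with `json.loads`).

```python
# `results` is assumed to be the dictionary shown above, parsed with json.loads.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")  # the 57 MMLU sub-tasks
}
print(f"{len(mmlu)} MMLU sub-tasks, macro-averaged acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```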
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded | [
"region:us"
] | 2024-01-18T14:19:30+00:00 | {"pretty_name": "Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded", "dataset_summary": "Dataset automatically created during the evaluation run of model [fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded](https://huggingface.co/fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T14:17:05.941006](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded/blob/main/results_2024-01-18T14-17-05.941006.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4090385972648381,\n \"acc_stderr\": 0.0341788866124078,\n \"acc_norm\": 0.41460362597527234,\n \"acc_norm_stderr\": 0.03503821200342104,\n \"mc1\": 0.3084455324357405,\n \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4504265715815864,\n \"mc2_stderr\": 0.014995497671563001\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4206484641638225,\n \"acc_stderr\": 0.014426211252508403,\n \"acc_norm\": 0.4590443686006826,\n \"acc_norm_stderr\": 0.01456229107360123\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5291774546903008,\n \"acc_stderr\": 0.004981278326428013,\n \"acc_norm\": 0.714299940250946,\n \"acc_norm_stderr\": 0.004508239594503835\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n \"acc_norm_stderr\": 
0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.3815028901734104,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596239,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596239\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.42258064516129035,\n \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.42258064516129035,\n \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5595854922279793,\n \"acc_stderr\": 0.035827245300360945,\n \"acc_norm\": 
0.5595854922279793,\n \"acc_norm_stderr\": 0.035827245300360945\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.37435897435897436,\n \"acc_stderr\": 0.024537591572830513,\n \"acc_norm\": 0.37435897435897436,\n \"acc_norm_stderr\": 0.024537591572830513\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.031282177063684594,\n \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.031282177063684594\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.4917431192660551,\n \"acc_stderr\": 0.021434399918214338,\n \"acc_norm\": 0.4917431192660551,\n \"acc_norm_stderr\": 0.021434399918214338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560524,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560524\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524867,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524867\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4767932489451477,\n \"acc_stderr\": 0.032512152011410174,\n \"acc_norm\": 0.4767932489451477,\n \"acc_norm_stderr\": 0.032512152011410174\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.4260089686098655,\n \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5427841634738186,\n \"acc_stderr\": 0.01781438523853444,\n \"acc_norm\": 0.5427841634738186,\n \"acc_norm_stderr\": 0.01781438523853444\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41329479768786126,\n \"acc_stderr\": 0.026511261369409247,\n \"acc_norm\": 0.41329479768786126,\n \"acc_norm_stderr\": 0.026511261369409247\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961447,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961447\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.028568699752225868,\n \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.028568699752225868\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.5241157556270096,\n \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.41975308641975306,\n \"acc_stderr\": 0.027460099557005138,\n \"acc_norm\": 0.41975308641975306,\n \"acc_norm_stderr\": 0.027460099557005138\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611317,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611317\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31681877444589307,\n \"acc_stderr\": 0.011882349954723011,\n \"acc_norm\": 0.31681877444589307,\n \"acc_norm_stderr\": 0.011882349954723011\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.029029422815681393,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.029029422815681393\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.02000791273935936,\n \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.02000791273935936\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n \"acc_stderr\": 0.03533389234739244,\n \"acc_norm\": 0.5174129353233831,\n \"acc_norm_stderr\": 0.03533389234739244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n \"acc_stderr\": 0.037117251907407486,\n \"acc_norm\": 0.3493975903614458,\n \"acc_norm_stderr\": 0.037117251907407486\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5672514619883041,\n \"acc_stderr\": 0.03799978644370607,\n \"acc_norm\": 0.5672514619883041,\n \"acc_norm_stderr\": 0.03799978644370607\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4504265715815864,\n \"mc2_stderr\": 0.014995497671563001\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6882399368587214,\n \"acc_stderr\": 0.013018571197638548\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \"acc_stderr\": 0.004302045046564279\n }\n}\n```", "repo_url": "https://huggingface.co/fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["**/details_harness|winogrande|5_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T14-17-05.941006.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T14_17_05.941006", "path": ["results_2024-01-18T14-17-05.941006.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T14-17-05.941006.parquet"]}]}]} | 2024-01-18T14:19:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded
Dataset automatically created during the evaluation run of model fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
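A minimal loading sketch, following the pattern used by these leaderboard detail datasets; the repository id below is inferred from the evaluated model name and the config name is taken from the configs listed in this card's metadata, so treat both as assumptions:

```python
from datasets import load_dataset

# Repository id inferred from the details_<org>__<model> naming pattern used by the leaderboard.
data = load_dataset("open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded",
	"harness_winogrande_5",
	split="train")
```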
## Latest results
These are the latest results from run 2024-01-18T14:17:05.941006 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:17:05.941006(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:17:05.941006(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
dfc9f9cc60aaf93a4cbbc60a3bb66d1fd9910dfa |
# Dataset Card for Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Ambari-7B-base-v0.1-sharded](https://huggingface.co/fierysurf/Ambari-7B-base-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded",
"harness_winogrande_5",
split="train")
```
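The aggregated metrics mentioned above live in the "results" configuration; the following is a sketch of loading the most recent aggregate, assuming (as the other leaderboard detail datasets do) that a "latest" split points at the newest run:

```python
from datasets import load_dataset

# "latest" is the split alias that tracks the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded",
	"results",
	split="latest")
```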
## Latest results
These are the [latest results from run 2024-01-18T14:24:01.960531](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded/blob/main/results_2024-01-18T14-24-01.960531.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.405829534710468,
"acc_stderr": 0.034221917154898474,
"acc_norm": 0.4109431400494777,
"acc_norm_stderr": 0.03510054355301299,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931576,
"mc2": 0.3891001339071866,
"mc2_stderr": 0.013756179587991524
},
"harness|arc:challenge|25": {
"acc": 0.4462457337883959,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.47952218430034127,
"acc_norm_stderr": 0.014599131353035007
},
"harness|hellaswag|10": {
"acc": 0.5528779127663812,
"acc_stderr": 0.004961799358836434,
"acc_norm": 0.7461661023700458,
"acc_norm_stderr": 0.004343142545094248
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4161290322580645,
"acc_stderr": 0.028040981380761547,
"acc_norm": 0.4161290322580645,
"acc_norm_stderr": 0.028040981380761547
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03902551007374449,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03902551007374449
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.03594413711272437,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.03594413711272437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844082,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5266055045871559,
"acc_stderr": 0.021406952688151574,
"acc_norm": 0.5266055045871559,
"acc_norm_stderr": 0.021406952688151574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5021097046413502,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.5021097046413502,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536821,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536821
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.03919415545048411,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.03919415545048411
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5242718446601942,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.5242718446601942,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.03222414045241108,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.03222414045241108
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5402298850574713,
"acc_stderr": 0.01782199409693354,
"acc_norm": 0.5402298850574713,
"acc_norm_stderr": 0.01782199409693354
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.026538189104705477,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.026538189104705477
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.02827549015679143,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.02827549015679143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.028394421370984538,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.028394421370984538
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.011787910251664587,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.011787910251664587
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.01987380200506118,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.01987380200506118
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3510204081632653,
"acc_stderr": 0.030555316755573644,
"acc_norm": 0.3510204081632653,
"acc_norm_stderr": 0.030555316755573644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.0374005938202932,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.0374005938202932
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5730994152046783,
"acc_stderr": 0.03793620616529916,
"acc_norm": 0.5730994152046783,
"acc_norm_stderr": 0.03793620616529916
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931576,
"mc2": 0.3891001339071866,
"mc2_stderr": 0.013756179587991524
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404676
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.003447819272389016
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded | [
"region:us"
] | 2024-01-18T14:26:27+00:00 | {"pretty_name": "Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded", "dataset_summary": "Dataset automatically created during the evaluation run of model [fierysurf/Ambari-7B-base-v0.1-sharded](https://huggingface.co/fierysurf/Ambari-7B-base-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T14:24:01.960531](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded/blob/main/results_2024-01-18T14-24-01.960531.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.405829534710468,\n \"acc_stderr\": 0.034221917154898474,\n \"acc_norm\": 0.4109431400494777,\n \"acc_norm_stderr\": 0.03510054355301299,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931576,\n \"mc2\": 0.3891001339071866,\n \"mc2_stderr\": 0.013756179587991524\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4462457337883959,\n \"acc_stderr\": 0.014526705548539982,\n \"acc_norm\": 0.47952218430034127,\n \"acc_norm_stderr\": 0.014599131353035007\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5528779127663812,\n \"acc_stderr\": 0.004961799358836434,\n \"acc_norm\": 0.7461661023700458,\n \"acc_norm_stderr\": 0.004343142545094248\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.03032594578928611,\n \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.03032594578928611\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n \"acc_norm_stderr\": 0.040629907841466674\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236785,\n \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236785\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4161290322580645,\n \"acc_stderr\": 0.028040981380761547,\n \"acc_norm\": 0.4161290322580645,\n \"acc_norm_stderr\": 0.028040981380761547\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03902551007374449,\n \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03902551007374449\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n \"acc_norm\": 0.5440414507772021,\n 
\"acc_norm_stderr\": 0.03594413711272437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.02466674491518722,\n \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.02466674491518722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.03196876989195778,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03196876989195778\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5266055045871559,\n \"acc_stderr\": 0.021406952688151574,\n \"acc_norm\": 0.5266055045871559,\n \"acc_norm_stderr\": 0.021406952688151574\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5021097046413502,\n \"acc_stderr\": 0.032546938018020076,\n \"acc_norm\": 0.5021097046413502,\n \"acc_norm_stderr\": 0.032546938018020076\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.03919415545048411,\n \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.03919415545048411\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5242718446601942,\n \"acc_stderr\": 0.049449010929737795,\n \"acc_norm\": 0.5242718446601942,\n \"acc_norm_stderr\": 0.049449010929737795\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.03222414045241108,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.03222414045241108\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5402298850574713,\n \"acc_stderr\": 0.01782199409693354,\n \"acc_norm\": 0.5402298850574713,\n \"acc_norm_stderr\": 0.01782199409693354\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.026538189104705477,\n \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.026538189104705477\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.02827549015679143,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.02827549015679143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n \"acc_stderr\": 0.028394421370984538,\n \"acc_norm\": 0.5080385852090032,\n \"acc_norm_stderr\": 0.028394421370984538\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.0277012284685426,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.0277012284685426\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.011787910251664587,\n \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.011787910251664587\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403196,\n \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403196\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4068627450980392,\n \"acc_stderr\": 0.01987380200506118,\n \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.01987380200506118\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3510204081632653,\n \"acc_stderr\": 0.030555316755573644,\n \"acc_norm\": 0.3510204081632653,\n \"acc_norm_stderr\": 0.030555316755573644\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n \"acc_stderr\": 0.0374005938202932,\n \"acc_norm\": 0.3614457831325301,\n \"acc_norm_stderr\": 0.0374005938202932\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5730994152046783,\n \"acc_stderr\": 0.03793620616529916,\n \"acc_norm\": 0.5730994152046783,\n \"acc_norm_stderr\": 0.03793620616529916\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931576,\n \"mc2\": 0.3891001339071866,\n \"mc2_stderr\": 0.013756179587991524\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.01592115238817286,\n \"acc_stderr\": 0.003447819272389016\n }\n}\n```", "repo_url": "https://huggingface.co/fierysurf/Ambari-7B-base-v0.1-sharded", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["**/details_harness|winogrande|5_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T14-24-01.960531.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T14_24_01.960531", "path": ["results_2024-01-18T14-24-01.960531.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T14-24-01.960531.parquet"]}]}]} | 2024-01-18T14:26:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded
Dataset automatically created during the evaluation run of model fierysurf/Ambari-7B-base-v0.1-sharded on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
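For example (this mirrors the loading snippet recorded in this dataset's metadata; `harness_winogrande_5` is just one of the available configurations):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this run.
data = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded",
    "harness_winogrande_5",
    split="train",
)
```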
## Latest results
These are the latest results from run 2024-01-18T14:24:01.960531 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
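The full metric listing is not reproduced here; as a sketch, the aggregated numbers can be pulled from the `results` configuration instead (the configuration and split names come from this dataset's metadata):

```python
from datasets import load_dataset

# Load the aggregated results of the latest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run
```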
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Ambari-7B-base-v0.1-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:24:01.960531(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Ambari-7B-base-v0.1-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:24:01.960531(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0f1318f6688a1ad6f69039febb3805c80beb1b5a |
Source https://universe.roboflow.com/hugo-cuautle-magos/ecg-ialjy | brainer/ecg-ialjy | [
"region:us"
] | 2024-01-18T14:29:25+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "width", "dtype": "int64"}, {"name": "height", "dtype": "int64"}, {"name": "pixel_values", "sequence": {"sequence": {"sequence": "uint8"}}}, {"name": "image_id", "dtype": "int64"}, {"name": "image_path", "dtype": "string"}, {"name": "objects", "struct": [{"name": "area", "sequence": "float64"}, {"name": "bbox", "sequence": {"sequence": "float64"}}, {"name": "category", "sequence": "int64"}, {"name": "id", "sequence": "int64"}]}], "splits": [{"name": "train", "num_bytes": 217663041.0, "num_examples": 73}, {"name": "test", "num_bytes": 29820234.0, "num_examples": 10}, {"name": "valid", "num_bytes": 62616754.0, "num_examples": 21}], "download_size": 80971481, "dataset_size": 310100029.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}]} | 2024-01-19T12:42:52+00:00 | [] | [] | TAGS
#region-us
|
Source URL | [] | [
"TAGS\n#region-us \n"
] |
47d0951834b9032b1a362db68949dca14f4a77a8 | # Dataset Card for "aihub_admin_generated_answers_3question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | wisenut-nlp-team/aihub_admin_generated_answers_3question | [
"region:us"
] | 2024-01-18T14:30:27+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "sequence": "string"}, {"name": "answer", "sequence": "string"}, {"name": "original_answer", "sequence": "string"}, {"name": "similar_contexts", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 15143160551, "num_examples": 315745}], "download_size": 6996064997, "dataset_size": 15143160551}} | 2024-01-18T14:39:04+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "aihub_admin_generated_answers_3question"
More Information needed | [
"# Dataset Card for \"aihub_admin_generated_answers_3question\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"aihub_admin_generated_answers_3question\"\n\nMore Information needed"
] |
39993193962733c286fa768de4340131d4dbfb7b |
Datasets used in the ClimateGPT paper for Climate Evaluation.
* Github Repo containing Prompt Templates for Climate Evaluation: [https://github.com/eci-io/climategpt-evaluation](https://github.com/eci-io/climategpt-evaluation)
* Paper Link: [https://arxiv.org/abs/2401.09646](https://arxiv.org/abs/2401.09646)
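Each task can be loaded as its own configuration of this repository; a minimal sketch (the configuration and split names, e.g. `climatext` or `cdp_qa`, are taken from the dataset metadata):

```python
from datasets import load_dataset

# "climatext" is one of the evaluation tasks; others include "cdp_qa",
# "climate_eng", "climate_stance", "exams", "exeter" and "translated_exams".
climatext = load_dataset("eci-io/climate-evaluation", "climatext", split="test")
print(climatext[0])
```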
### Citation Information
```
@misc{thulke2024climategpt,
title={ClimateGPT: Towards AI Synthesizing Interdisciplinary Research on Climate Change},
author={David Thulke and Yingbo Gao and Petrus Pelser and Rein Brune and Rricha Jalota and Floris Fok and Michael Ramos and Ian van Wyk and Abdallah Nasir and Hayden Goldstein and Taylor Tragemann and Katie Nguyen and Ariana Fowler and Andrew Stanco and Jon Gabriel and Jordan Taylor and Dean Moro and Evgenii Tsymbalov and Juliette de Waal and Evgeny Matusov and Mudar Yaghi and Mohammad Shihadah and Hermann Ney and Christian Dugast and Jonathan Dotan and Daniel Erasmus},
year={2024},
eprint={2401.09646},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
| eci-io/climate-evaluation | [
"task_categories:text-classification",
"task_categories:multiple-choice",
"language:en",
"arxiv:2401.09646",
"region:us"
] | 2024-01-18T14:32:58+00:00 | {"language": ["en"], "task_categories": ["text-classification", "multiple-choice"], "dataset_info": [{"config_name": "cdp_qa", "features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0", "1": "1"}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 397518015, "num_examples": 548155}, {"name": "validation", "num_bytes": 58167638, "num_examples": 78876}, {"name": "test", "num_bytes": 66654435, "num_examples": 92652}], "download_size": 512401333, "dataset_size": 522340088}, {"config_name": "climate_eng", "features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0", "1": "1", "2": "2", "3": "3", "4": "4"}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 625518, "num_examples": 2871}, {"name": "validation", "num_bytes": 78234, "num_examples": 354}, {"name": "test", "num_bytes": 81454, "num_examples": 355}], "download_size": 743756, "dataset_size": 785206}, {"config_name": "climate_stance", "features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": 0, "1": 1, "2": 2}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 625518, "num_examples": 2871}, {"name": "validation", "num_bytes": 78234, "num_examples": 354}, {"name": "test", "num_bytes": 81454, "num_examples": 355}], "download_size": 743756, "dataset_size": 785206}, {"config_name": "climatext", "features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0", "1": "1"}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 847902, "num_examples": 6000}, {"name": "validation", "num_bytes": 48406, "num_examples": 300}, {"name": "test", "num_bytes": 260912, "num_examples": 1600}], "download_size": 1385322, "dataset_size": 1157220}, {"config_name": "exams", "features": [{"name": "subject", "dtype": "string"}, {"name": "question_stem", "dtype": "string"}, {"name": "choices", "dtype": "string"}, {"name": "answerKey", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "test", "num_bytes": 165711, "num_examples": 484}], "download_size": 157661, "dataset_size": 165711}, {"config_name": "exeter", "features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "0", "1": "1"}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "train", "num_bytes": 7495896, "num_examples": 23436}, {"name": "validation", "num_bytes": 837247, "num_examples": 2605}, {"name": "test", "num_bytes": 1053039, "num_examples": 2904}], "download_size": 9071528, "dataset_size": 9386182}, {"config_name": "translated_exams", "features": [{"name": "subject", "dtype": "string"}, {"name": "question_stem", "dtype": "string"}, {"name": "choices", "dtype": "string"}, {"name": "answerKey", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}, {"name": "idx", "dtype": "int32"}], "splits": [{"name": "test", "num_bytes": 132380, "num_examples": 484}], "download_size": 125236, "dataset_size": 132380}]} | 2024-02-15T11:50:43+00:00 | [
"2401.09646"
] | [
"en"
] | TAGS
#task_categories-text-classification #task_categories-multiple-choice #language-English #arxiv-2401.09646 #region-us
|
Datasets used in the ClimateGPT paper for Climate Evaluation.
* Github Repo containing Prompt Templates for Climate Evaluation: URL
* Paper Link: URL
| [] | [
"TAGS\n#task_categories-text-classification #task_categories-multiple-choice #language-English #arxiv-2401.09646 #region-us \n"
] |
26cd4e4c1b0574b0b2b76a2cdbf46b4b89c8102b |
This dataset is the standard [Wikipedia](https://huggingface.co/datasets/wikipedia) dataset preprocessed with `apache_beam` and `mwparserfromhell`.
Because the upstream preprocessing takes too long to run, we distribute this already-preprocessed version.
Please follow the terms and licenses of the original Wikipedia.
## Preprocessing Code
```python
import datasets as ds
dataset = ds.load_dataset(
"wikipedia",
language="ja",
date="20240101",
beam_runner="DirectRunner",
trust_remote_code=True,
)
dataset.push_to_hub("hpprc/wikipedia-20240101", max_shard_size="1GB")
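# Usage sketch (an assumption, not part of the original snippet): once pushed,
# the preprocessed dump can be reloaded directly from the Hub without Apache Beam.
reloaded = ds.load_dataset("hpprc/wikipedia-20240101", split="train")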
``` | hpprc/wikipedia-20240101 | [
"language:ja",
"license:cc-by-sa-3.0",
"region:us"
] | 2024-01-18T14:33:06+00:00 | {"language": ["ja"], "license": "cc-by-sa-3.0", "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7088384480, "num_examples": 1395760}], "download_size": 3968649901, "dataset_size": 7088384480}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-22T07:41:53+00:00 | [] | [
"ja"
] | TAGS
#language-Japanese #license-cc-by-sa-3.0 #region-us
|
This dataset is the standard Wikipedia dataset preprocessed with 'apache_beam' and 'mwparserfromhell'.
Because the upstream preprocessing takes too long to run, we distribute this already-preprocessed version.
Please follow the terms and licenses of the original Wikipedia.
## Preprocessing Code
| [
"## Preprocessing Code"
] | [
"TAGS\n#language-Japanese #license-cc-by-sa-3.0 #region-us \n",
"## Preprocessing Code"
] |
89884bd466d5598e681832015999a0d85905ff6d |
# Dataset Card for Evaluation run of fierysurf/Ambari-7B-Instruct-v0.1-sharded
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Ambari-7B-Instruct-v0.1-sharded](https://huggingface.co/fierysurf/Ambari-7B-Instruct-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Ambari-7B-Instruct-v0.1-sharded",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T14:37:24.844010](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Ambari-7B-Instruct-v0.1-sharded/blob/main/results_2024-01-18T14-37-24.844010.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3838329966166915,
"acc_stderr": 0.03388122362619041,
"acc_norm": 0.388551088671872,
"acc_norm_stderr": 0.034737127539897106,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.4039249872200656,
"mc2_stderr": 0.014252904541596906
},
"harness|arc:challenge|25": {
"acc": 0.46331058020477817,
"acc_stderr": 0.01457200052775699,
"acc_norm": 0.5,
"acc_norm_stderr": 0.014611390804670088
},
"harness|hellaswag|10": {
"acc": 0.5575582553276239,
"acc_stderr": 0.004956609327218403,
"acc_norm": 0.7458673571001793,
"acc_norm_stderr": 0.004344827546976545
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3870967741935484,
"acc_stderr": 0.02770935967503249,
"acc_norm": 0.3870967741935484,
"acc_norm_stderr": 0.02770935967503249
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.03904272341431856,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.03904272341431856
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.51010101010101,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.51010101010101,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5233160621761658,
"acc_stderr": 0.036045136724422014,
"acc_norm": 0.5233160621761658,
"acc_norm_stderr": 0.036045136724422014
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37435897435897436,
"acc_stderr": 0.024537591572830513,
"acc_norm": 0.37435897435897436,
"acc_norm_stderr": 0.024537591572830513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3865546218487395,
"acc_stderr": 0.031631458075523804,
"acc_norm": 0.3865546218487395,
"acc_norm_stderr": 0.031631458075523804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47889908256880737,
"acc_stderr": 0.021418224754264643,
"acc_norm": 0.47889908256880737,
"acc_norm_stderr": 0.021418224754264643
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.029886910547626964,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.029886910547626964
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4362745098039216,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.4362745098039216,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45569620253164556,
"acc_stderr": 0.03241920684693333,
"acc_norm": 0.45569620253164556,
"acc_norm_stderr": 0.03241920684693333
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.04537935177947879,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.04537935177947879
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261836,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258975,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03255326307272487,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03255326307272487
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5057471264367817,
"acc_stderr": 0.01787878232612923,
"acc_norm": 0.5057471264367817,
"acc_norm_stderr": 0.01787878232612923
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925293,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.369281045751634,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.369281045751634,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.44694533762057875,
"acc_stderr": 0.028237769422085335,
"acc_norm": 0.44694533762057875,
"acc_norm_stderr": 0.028237769422085335
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.38271604938271603,
"acc_stderr": 0.027044538138402616,
"acc_norm": 0.38271604938271603,
"acc_norm_stderr": 0.027044538138402616
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3057366362451108,
"acc_stderr": 0.011766973847072915,
"acc_norm": 0.3057366362451108,
"acc_norm_stderr": 0.011766973847072915
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.380718954248366,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.380718954248366,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.30612244897959184,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.30612244897959184,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.46766169154228854,
"acc_stderr": 0.035281314729336065,
"acc_norm": 0.46766169154228854,
"acc_norm_stderr": 0.035281314729336065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5497076023391813,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.5497076023391813,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.4039249872200656,
"mc2_stderr": 0.014252904541596906
},
"harness|winogrande|5": {
"acc": 0.6953433307024467,
"acc_stderr": 0.012935646499325312
},
"harness|gsm8k|5": {
"acc": 0.018953752843062926,
"acc_stderr": 0.0037560783410314712
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fierysurf__Ambari-7B-Instruct-v0.1-sharded | [
"region:us"
] | 2024-01-18T14:39:45+00:00 | {"pretty_name": "Evaluation run of fierysurf/Ambari-7B-Instruct-v0.1-sharded", "dataset_summary": "Dataset automatically created during the evaluation run of model [fierysurf/Ambari-7B-Instruct-v0.1-sharded](https://huggingface.co/fierysurf/Ambari-7B-Instruct-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Ambari-7B-Instruct-v0.1-sharded\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T14:37:24.844010](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Ambari-7B-Instruct-v0.1-sharded/blob/main/results_2024-01-18T14-37-24.844010.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3838329966166915,\n \"acc_stderr\": 0.03388122362619041,\n \"acc_norm\": 0.388551088671872,\n \"acc_norm_stderr\": 0.034737127539897106,\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.4039249872200656,\n \"mc2_stderr\": 0.014252904541596906\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.46331058020477817,\n \"acc_stderr\": 0.01457200052775699,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.014611390804670088\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5575582553276239,\n \"acc_stderr\": 0.004956609327218403,\n \"acc_norm\": 0.7458673571001793,\n \"acc_norm_stderr\": 0.004344827546976545\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115205,\n \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115205\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 
0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.3179190751445087,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3870967741935484,\n \"acc_stderr\": 0.02770935967503249,\n \"acc_norm\": 0.3870967741935484,\n \"acc_norm_stderr\": 0.02770935967503249\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.03904272341431856,\n \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.03904272341431856\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.51010101010101,\n \"acc_stderr\": 0.035616254886737454,\n \"acc_norm\": 0.51010101010101,\n \"acc_norm_stderr\": 0.035616254886737454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.036045136724422014,\n \"acc_norm\": 
0.5233160621761658,\n \"acc_norm_stderr\": 0.036045136724422014\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.37435897435897436,\n \"acc_stderr\": 0.024537591572830513,\n \"acc_norm\": 0.37435897435897436,\n \"acc_norm_stderr\": 0.024537591572830513\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3865546218487395,\n \"acc_stderr\": 0.031631458075523804,\n \"acc_norm\": 0.3865546218487395,\n \"acc_norm_stderr\": 0.031631458075523804\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.47889908256880737,\n \"acc_stderr\": 0.021418224754264643,\n \"acc_norm\": 0.47889908256880737,\n \"acc_norm_stderr\": 0.021418224754264643\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.029886910547626964,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.029886910547626964\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4362745098039216,\n \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.45569620253164556,\n \"acc_stderr\": 0.03241920684693333,\n \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.03241920684693333\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.5022421524663677,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.04537935177947879,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.04537935177947879\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261836,\n \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258975,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258975\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03255326307272487,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03255326307272487\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5057471264367817,\n \"acc_stderr\": 0.01787878232612923,\n \"acc_norm\": 0.5057471264367817,\n \"acc_norm_stderr\": 0.01787878232612923\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546545,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.369281045751634,\n \"acc_stderr\": 0.027634176689602656,\n \"acc_norm\": 0.369281045751634,\n \"acc_norm_stderr\": 0.027634176689602656\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.44694533762057875,\n \"acc_stderr\": 0.028237769422085335,\n \"acc_norm\": 0.44694533762057875,\n \"acc_norm_stderr\": 0.028237769422085335\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.38271604938271603,\n \"acc_stderr\": 0.027044538138402616,\n \"acc_norm\": 0.38271604938271603,\n \"acc_norm_stderr\": 0.027044538138402616\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3057366362451108,\n \"acc_stderr\": 0.011766973847072915,\n \"acc_norm\": 0.3057366362451108,\n \"acc_norm_stderr\": 0.011766973847072915\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.380718954248366,\n \"acc_stderr\": 0.019643801557924803,\n \"acc_norm\": 0.380718954248366,\n \"acc_norm_stderr\": 0.019643801557924803\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.30612244897959184,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.30612244897959184,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.46766169154228854,\n \"acc_stderr\": 0.035281314729336065,\n \"acc_norm\": 0.46766169154228854,\n \"acc_norm_stderr\": 0.035281314729336065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5497076023391813,\n \"acc_stderr\": 0.038158273659132366,\n \"acc_norm\": 0.5497076023391813,\n \"acc_norm_stderr\": 0.038158273659132366\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.4039249872200656,\n \"mc2_stderr\": 0.014252904541596906\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6953433307024467,\n \"acc_stderr\": 0.012935646499325312\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.018953752843062926,\n \"acc_stderr\": 0.0037560783410314712\n }\n}\n```", "repo_url": "https://huggingface.co/fierysurf/Ambari-7B-Instruct-v0.1-sharded", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-37-24.844010.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-37-24.844010.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-37-24.844010.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T14-37-24.844010.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-37-24.844010.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["**/details_harness|winogrande|5_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T14-37-24.844010.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T14_37_24.844010", "path": ["results_2024-01-18T14-37-24.844010.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T14-37-24.844010.parquet"]}]}]} | 2024-01-18T14:40:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of fierysurf/Ambari-7B-Instruct-v0.1-sharded
Dataset automatically created during the evaluation run of model fierysurf/Ambari-7B-Instruct-v0.1-sharded on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
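For example (a minimal sketch — the repository name below assumes the leaderboard's usual `details_<org>__<model>` naming convention, so double-check it against the actual repo):
```python
from datasets import load_dataset

# Load one task configuration from the evaluation-details repository.
# The "train" split always points to the latest results for that task.
data = load_dataset(
    "open-llm-leaderboard/details_fierysurf__Ambari-7B-Instruct-v0.1-sharded",
    "harness_winogrande_5",  # any of the 63 configurations can be used here
    split="train",
)
```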
## Latest results
These are the latest results from run 2024-01-18T14:37:24.844010 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of fierysurf/Ambari-7B-Instruct-v0.1-sharded\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Ambari-7B-Instruct-v0.1-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:37:24.844010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of fierysurf/Ambari-7B-Instruct-v0.1-sharded\n\n\n\nDataset automatically created during the evaluation run of model fierysurf/Ambari-7B-Instruct-v0.1-sharded on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T14:37:24.844010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
81cfc8d08b3707084c108c7a2b32de71ecfddc87 | # Dataset Card for "compositionality-subsample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mehdidc/compositionality-subsample | [
"region:us"
] | 2024-01-18T14:42:06+00:00 | {"dataset_info": {"features": [{"name": "caption", "dtype": "string"}, {"name": "caption_source", "dtype": "string"}, {"name": "image_0_url", "dtype": "string"}, {"name": "image_1_url", "dtype": "string"}, {"name": "label_0", "dtype": "float64"}, {"name": "label_1", "dtype": "float64"}, {"name": "num_example_per_prompt", "dtype": "int64"}, {"name": "model_0", "dtype": "string"}, {"name": "model_1", "dtype": "string"}, {"name": "jpg_0", "dtype": "binary"}, {"name": "jpg_1", "dtype": "binary"}, {"name": "are_different", "dtype": "bool"}, {"name": "has_label", "dtype": "bool"}, {"name": "origin", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 40729530.0, "num_examples": 200}, {"name": "validation", "num_bytes": 43656367, "num_examples": 200}, {"name": "test", "num_bytes": 33184629, "num_examples": 103}], "download_size": 110145866, "dataset_size": 117570526.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-20T13:25:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "compositionality-subsample"
More Information needed | [
"# Dataset Card for \"compositionality-subsample\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"compositionality-subsample\"\n\nMore Information needed"
] |
ab6dd12a8c6b136376c57386e94a814bb13605c5 |
# VegAnn Dataset
### **Vegetation Annotation of a large multi-crop RGB Dataset acquired under diverse conditions for image semantic segmentation**
## Keypoints ⏳
- VegAnn contains 3775 images
- Images are 512x512 pixels
- The corresponding binary masks are 0 for soil + crop residues (background) and 255 for vegetation (foreground)
- The dataset includes images of 26+ crop species, which are not evenly represented
- VegAnn was compiled using a variety of outdoor images captured with different acquisition systems and configurations
- For more information about VegAnn, details, labeling rules and potential uses see https://doi.org/10.1038/s41597-023-02098-y
## Dataset Description 📚
VegAnn, short for Vegetation Annotation, is a meticulously curated collection of 3,775 multi-crop RGB images aimed at enhancing research in crop vegetation segmentation. These images span various phenological stages and were captured using diverse systems and platforms under a wide range of illumination conditions. By aggregating sub-datasets from different projects and institutions, VegAnn represents a broad spectrum of measurement conditions, crop species, and development stages.
### Languages 🌐
The annotations and documentation are primarily in English.
## Dataset Structure 🏗
### Data Instances 📸
A VegAnn data instance consists of a 512x512 pixel RGB image patch derived from larger raw images. These patches are designed to provide sufficient detail for distinguishing between vegetation and background, crucial for applications in semantic segmentation and other forms of computer vision analysis in agricultural contexts.

### Data Fields 📋
- `Name`: Unique identifier for each image patch.
- `System`: The imaging system used to acquire the photo (e.g., Handheld Cameras, DHP, UAV).
- `Orientation`: The camera's orientation during image capture (e.g., Nadir, 45 degrees).
- `latitude` and `longitude`: Geographic coordinates where the image was taken.
- `date`: Date of image acquisition.
- `LocAcc`: Location accuracy flag (1 for high accuracy, 0 for low or uncertain accuracy).
- `Species`: The crop species featured in the image (e.g., Wheat, Maize, Soybean).
- `Owner`: The institution or entity that provided the image (e.g., Arvalis, INRAe).
- `Dataset-Name`: The sub-dataset or project from which the image originates (e.g., Phenomobile, Easypcc).
- `TVT-split1` to `TVT-split5`: Fields indicating the train/validation/test split configurations, facilitating various experimental setups.
### Data Splits 📊
The dataset is structured into multiple splits (as indicated by `TVT-split` fields) to support different training, validation, and testing scenarios in machine learning workflows.
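One possible way to materialise a split is sketched below; the label value used in the filter is an assumption and should be checked first against the actual contents of the `TVT-split` columns (e.g. with `unique`).
```python
from datasets import load_dataset

ds = load_dataset("simonMadec/VegAnn", split="train")

# Inspect which labels the first split configuration uses before filtering
print(ds.unique("TVT-split1"))

# Replace "Training" with one of the values printed above
train_part = ds.filter(lambda ex: ex["TVT-split1"] == "Training")
```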
## Dataset Creation 🛠
### Curation Rationale 🤔
The VegAnn dataset was developed to address the gap in available datasets for training convolutional neural networks (CNNs) for the task of semantic segmentation in real-world agricultural environments. By incorporating images from a wide array of conditions and stages of crop development, VegAnn aims to enhance the performance of segmentation algorithms, promote benchmarking, and foster research on large-scale crop vegetation segmentation.
### Source Data 🌱
#### Initial Data Collection and Normalization
Images within VegAnn were sourced from various sub-datasets contributed by different institutions, each under specific acquisition configurations. These were then standardized into 512x512 pixel patches to maintain consistency across the dataset.
#### Who are the source data providers?
The data was provided by a collaboration of institutions including Arvalis, INRAe, The University of Tokyo, University of Queensland, NEON, and EOLAB, among others.

### Annotations 📝
#### Annotation process
Annotations for the dataset were focused on distinguishing between vegetation and background within the images. The process ensured that the images offered sufficient spatial resolution to allow for accurate visual segmentation.
#### Who are the annotators?
The annotations were performed by a team comprising researchers and domain experts from the contributing institutions.
## Considerations for Using the Data 🤓
### Social Impact of Dataset 🌍
The VegAnn dataset is expected to significantly impact agricultural research and commercial applications by enhancing the accuracy of crop monitoring, disease detection, and yield estimation through improved vegetation segmentation techniques.
### Discussion of Biases 🧐
Given the diverse sources of the images, there may be inherent biases towards certain crop types, geographical locations, and imaging conditions. Users should consider this diversity in applications and analyses.
### Licensing Information 📄
Please refer to the specific licensing agreements of the contributing institutions or contact the dataset providers for more information on usage rights and restrictions.
## Citation Information 📚
If you use the VegAnn dataset in your research, please cite the following:
```
@article{madec_vegann_2023,
title = {{VegAnn}, {Vegetation} {Annotation} of multi-crop {RGB} images acquired under diverse conditions for segmentation},
volume = {10},
issn = {2052-4463},
url = {https://doi.org/10.1038/s41597-023-02098-y},
doi = {10.1038/s41597-023-02098-y},
abstract = {Applying deep learning to images of cropping systems provides new knowledge and insights in research and commercial applications. Semantic segmentation or pixel-wise classification, of RGB images acquired at the ground level, into vegetation and background is a critical step in the estimation of several canopy traits. Current state of the art methodologies based on convolutional neural networks (CNNs) are trained on datasets acquired under controlled or indoor environments. These models are unable to generalize to real-world images and hence need to be fine-tuned using new labelled datasets. This motivated the creation of the VegAnn - Vegetation Annotation - dataset, a collection of 3775 multi-crop RGB images acquired for different phenological stages using different systems and platforms in diverse illumination conditions. We anticipate that VegAnn will help improving segmentation algorithm performances, facilitate benchmarking and promote large-scale crop vegetation segmentation research.},
number = {1},
journal = {Scientific Data},
author = {Madec, Simon and Irfan, Kamran and Velumani, Kaaviya and Baret, Frederic and David, Etienne and Daubige, Gaetan and Samatan, Lucas Bernigaud and Serouart, Mario and Smith, Daniel and James, Chrisbin and Camacho, Fernando and Guo, Wei and De Solan, Benoit and Chapman, Scott C. and Weiss, Marie},
month = may,
year = {2023},
pages = {302},
}
```
## Additional Information
- **Dataset Curators**: Simon Madec et al.
- **Version**: 1.0
- **License**: CC-BY
- **Contact**: [email protected] | simonMadec/VegAnn | [
"task_categories:image-segmentation",
"size_categories:1K<n<10K",
"language:en",
"vegetation",
"segmentation",
"region:us"
] | 2024-01-18T14:46:24+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["image-segmentation"], "tags": ["vegetation", "segmentation"], "DOI": ["10.1038/s41597-023-02098-y"], "licence": ["CC-BY"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "mask", "dtype": "image"}, {"name": "System", "dtype": "string"}, {"name": "Orientation", "dtype": "string"}, {"name": "latitude", "dtype": "float64"}, {"name": "longitude", "dtype": "float64"}, {"name": "date", "dtype": "string"}, {"name": "LocAcc", "dtype": "int64"}, {"name": "Species", "dtype": "string"}, {"name": "Owner", "dtype": "string"}, {"name": "Dataset-Name", "dtype": "string"}, {"name": "TVT-split1", "dtype": "string"}, {"name": "TVT-split2", "dtype": "string"}, {"name": "TVT-split3", "dtype": "string"}, {"name": "TVT-split4", "dtype": "string"}, {"name": "TVT-split5", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1896819757.9, "num_examples": 3775}], "download_size": 1940313757, "dataset_size": 1896819757.9}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-10T08:59:52+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-segmentation #size_categories-1K<n<10K #language-English #vegetation #segmentation #region-us
|
# VegAnn Dataset
### Vegetation Annotation of a large multi-crop RGB Dataset acquired under diverse conditions for image semantic segmentation
## Keypoints ⏳
- VegAnn contains 3775 images
- Images are 512*512 pixels
- Corresponding binary masks is 0 for soil + crop residues (background) 255 for Vegetation (foreground)
- The dataset includes images of 26+ crop species, which are not evenly represented
- VegAnn was compiled using a variety of outdoor images captured with different acquisition systems and configurations
- For more information about VegAnn, details, labeling rules and potential uses see URL
## Dataset Description
VegAnn, short for Vegetation Annotation, is a meticulously curated collection of 3,775 multi-crop RGB images aimed at enhancing research in crop vegetation segmentation. These images span various phenological stages and were captured using diverse systems and platforms under a wide range of illumination conditions. By aggregating sub-datasets from different projects and institutions, VegAnn represents a broad spectrum of measurement conditions, crop species, and development stages.
### Languages
The annotations and documentation are primarily in English.
## Dataset Structure
### Data Instances
A VegAnn data instance consists of a 512x512 pixel RGB image patch derived from larger raw images. These patches are designed to provide sufficient detail for distinguishing between vegetation and background, crucial for applications in semantic segmentation and other forms of computer vision analysis in agricultural contexts.
!image/png
### Data Fields
- 'Name': Unique identifier for each image patch.
- 'System': The imaging system used to acquire the photo (e.g., Handheld Cameras, DHP, UAV).
- 'Orientation': The camera's orientation during image capture (e.g., Nadir, 45 degrees).
- 'latitude' and 'longitude': Geographic coordinates where the image was taken.
- 'date': Date of image acquisition.
- 'LocAcc': Location accuracy flag (1 for high accuracy, 0 for low or uncertain accuracy).
- 'Species': The crop species featured in the image (e.g., Wheat, Maize, Soybean).
- 'Owner': The institution or entity that provided the image (e.g., Arvalis, INRAe).
- 'Dataset-Name': The sub-dataset or project from which the image originates (e.g., Phenomobile, Easypcc).
- 'TVT-split1' to 'TVT-split5': Fields indicating the train/validation/test split configurations, facilitating various experimental setups.
### Data Splits
The dataset is structured into multiple splits (as indicated by 'TVT-split' fields) to support different training, validation, and testing scenarios in machine learning workflows.
## Dataset Creation
### Curation Rationale
The VegAnn dataset was developed to address the gap in available datasets for training convolutional neural networks (CNNs) for the task of semantic segmentation in real-world agricultural environments. By incorporating images from a wide array of conditions and stages of crop development, VegAnn aims to enhance the performance of segmentation algorithms, promote benchmarking, and foster research on large-scale crop vegetation segmentation.
### Source Data
#### Initial Data Collection and Normalization
Images within VegAnn were sourced from various sub-datasets contributed by different institutions, each under specific acquisition configurations. These were then standardized into 512x512 pixel patches to maintain consistency across the dataset.
#### Who are the source data providers?
The data was provided by a collaboration of institutions including Arvalis, INRAe, The University of Tokyo, University of Queensland, NEON, and EOLAB, among others.
!image/png
### Annotations
#### Annotation process
Annotations for the dataset were focused on distinguishing between vegetation and background within the images. The process ensured that the images offered sufficient spatial resolution to allow for accurate visual segmentation.
#### Who are the annotators?
The annotations were performed by a team comprising researchers and domain experts from the contributing institutions.
## Considerations for Using the Data
### Social Impact of Dataset
The VegAnn dataset is expected to significantly impact agricultural research and commercial applications by enhancing the accuracy of crop monitoring, disease detection, and yield estimation through improved vegetation segmentation techniques.
### Discussion of Biases
Given the diverse sources of the images, there may be inherent biases towards certain crop types, geographical locations, and imaging conditions. Users should consider this diversity in applications and analyses.
### Licensing Information
Please refer to the specific licensing agreements of the contributing institutions or contact the dataset providers for more information on usage rights and restrictions.
If you use the VegAnn dataset in your research, please cite the following:
## Additional Information
- Dataset Curators: Simon Madec et al.
- Version: 1.0
- License: CC-BY
- Contact: URL@URL | [
"# VegAnn Dataset",
"### Vegetation Annotation of a large multi-crop RGB Dataset acquired under diverse conditions for image semantic segmentation",
"## Keypoints ⏳\n\n- VegAnn contains 3775 images \n- Images are 512*512 pixels \n- Corresponding binary masks is 0 for soil + crop residues (background) 255 for Vegetation (foreground)\n- The dataset includes images of 26+ crop species, which are not evenly represented\n- VegAnn was compiled using a variety of outdoor images captured with different acquisition systems and configurations\n- For more information about VegAnn, details, labeling rules and potential uses see URL",
"## Dataset Description \n\nVegAnn, short for Vegetation Annotation, is a meticulously curated collection of 3,775 multi-crop RGB images aimed at enhancing research in crop vegetation segmentation. These images span various phenological stages and were captured using diverse systems and platforms under a wide range of illumination conditions. By aggregating sub-datasets from different projects and institutions, VegAnn represents a broad spectrum of measurement conditions, crop species, and development stages.",
"### Languages \n\nThe annotations and documentation are primarily in English.",
"## Dataset Structure",
"### Data Instances \n\nA VegAnn data instance consists of a 512x512 pixel RGB image patch derived from larger raw images. These patches are designed to provide sufficient detail for distinguishing between vegetation and background, crucial for applications in semantic segmentation and other forms of computer vision analysis in agricultural contexts.\n\n\n!image/png",
"### Data Fields \n\n- 'Name': Unique identifier for each image patch.\n- 'System': The imaging system used to acquire the photo (e.g., Handheld Cameras, DHP, UAV).\n- 'Orientation': The camera's orientation during image capture (e.g., Nadir, 45 degrees).\n- 'latitude' and 'longitude': Geographic coordinates where the image was taken.\n- 'date': Date of image acquisition.\n- 'LocAcc': Location accuracy flag (1 for high accuracy, 0 for low or uncertain accuracy).\n- 'Species': The crop species featured in the image (e.g., Wheat, Maize, Soybean).\n- 'Owner': The institution or entity that provided the image (e.g., Arvalis, INRAe).\n- 'Dataset-Name': The sub-dataset or project from which the image originates (e.g., Phenomobile, Easypcc).\n- 'TVT-split1' to 'TVT-split5': Fields indicating the train/validation/test split configurations, facilitating various experimental setups.",
"### Data Splits \n\nThe dataset is structured into multiple splits (as indicated by 'TVT-split' fields) to support different training, validation, and testing scenarios in machine learning workflows.",
"## Dataset Creation",
"### Curation Rationale \n\nThe VegAnn dataset was developed to address the gap in available datasets for training convolutional neural networks (CNNs) for the task of semantic segmentation in real-world agricultural environments. By incorporating images from a wide array of conditions and stages of crop development, VegAnn aims to enhance the performance of segmentation algorithms, promote benchmarking, and foster research on large-scale crop vegetation segmentation.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nImages within VegAnn were sourced from various sub-datasets contributed by different institutions, each under specific acquisition configurations. These were then standardized into 512x512 pixel patches to maintain consistency across the dataset.",
"#### Who are the source data providers?\n\nThe data was provided by a collaboration of institutions including Arvalis, INRAe, The University of Tokyo, University of Queensland, NEON, and EOLAB, among others.\n\n\n!image/png",
"### Annotations",
"#### Annotation process\n\nAnnotations for the dataset were focused on distinguishing between vegetation and background within the images. The process ensured that the images offered sufficient spatial resolution to allow for accurate visual segmentation.",
"#### Who are the annotators?\n\nThe annotations were performed by a team comprising researchers and domain experts from the contributing institutions.",
"## Considerations for Using the Data",
"### Social Impact of Dataset \n\nThe VegAnn dataset is expected to significantly impact agricultural research and commercial applications by enhancing the accuracy of crop monitoring, disease detection, and yield estimation through improved vegetation segmentation techniques.",
"### Discussion of Biases \n\nGiven the diverse sources of the images, there may be inherent biases towards certain crop types, geographical locations, and imaging conditions. Users should consider this diversity in applications and analyses.",
"### Licensing Information \n\nPlease refer to the specific licensing agreements of the contributing institutions or contact the dataset providers for more information on usage rights and restrictions.\n\n \n\nIf you use the VegAnn dataset in your research, please cite the following:",
"## Additional Information\n\n- Dataset Curators: Simon Madec et al.\n- Version: 1.0\n- License: CC-BY\n- Contact: URL@URL"
] | [
"TAGS\n#task_categories-image-segmentation #size_categories-1K<n<10K #language-English #vegetation #segmentation #region-us \n",
"# VegAnn Dataset",
"### Vegetation Annotation of a large multi-crop RGB Dataset acquired under diverse conditions for image semantic segmentation",
"## Keypoints ⏳\n\n- VegAnn contains 3775 images \n- Images are 512*512 pixels \n- Corresponding binary masks is 0 for soil + crop residues (background) 255 for Vegetation (foreground)\n- The dataset includes images of 26+ crop species, which are not evenly represented\n- VegAnn was compiled using a variety of outdoor images captured with different acquisition systems and configurations\n- For more information about VegAnn, details, labeling rules and potential uses see URL",
"## Dataset Description \n\nVegAnn, short for Vegetation Annotation, is a meticulously curated collection of 3,775 multi-crop RGB images aimed at enhancing research in crop vegetation segmentation. These images span various phenological stages and were captured using diverse systems and platforms under a wide range of illumination conditions. By aggregating sub-datasets from different projects and institutions, VegAnn represents a broad spectrum of measurement conditions, crop species, and development stages.",
"### Languages \n\nThe annotations and documentation are primarily in English.",
"## Dataset Structure",
"### Data Instances \n\nA VegAnn data instance consists of a 512x512 pixel RGB image patch derived from larger raw images. These patches are designed to provide sufficient detail for distinguishing between vegetation and background, crucial for applications in semantic segmentation and other forms of computer vision analysis in agricultural contexts.\n\n\n!image/png",
"### Data Fields \n\n- 'Name': Unique identifier for each image patch.\n- 'System': The imaging system used to acquire the photo (e.g., Handheld Cameras, DHP, UAV).\n- 'Orientation': The camera's orientation during image capture (e.g., Nadir, 45 degrees).\n- 'latitude' and 'longitude': Geographic coordinates where the image was taken.\n- 'date': Date of image acquisition.\n- 'LocAcc': Location accuracy flag (1 for high accuracy, 0 for low or uncertain accuracy).\n- 'Species': The crop species featured in the image (e.g., Wheat, Maize, Soybean).\n- 'Owner': The institution or entity that provided the image (e.g., Arvalis, INRAe).\n- 'Dataset-Name': The sub-dataset or project from which the image originates (e.g., Phenomobile, Easypcc).\n- 'TVT-split1' to 'TVT-split5': Fields indicating the train/validation/test split configurations, facilitating various experimental setups.",
"### Data Splits \n\nThe dataset is structured into multiple splits (as indicated by 'TVT-split' fields) to support different training, validation, and testing scenarios in machine learning workflows.",
"## Dataset Creation",
"### Curation Rationale \n\nThe VegAnn dataset was developed to address the gap in available datasets for training convolutional neural networks (CNNs) for the task of semantic segmentation in real-world agricultural environments. By incorporating images from a wide array of conditions and stages of crop development, VegAnn aims to enhance the performance of segmentation algorithms, promote benchmarking, and foster research on large-scale crop vegetation segmentation.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nImages within VegAnn were sourced from various sub-datasets contributed by different institutions, each under specific acquisition configurations. These were then standardized into 512x512 pixel patches to maintain consistency across the dataset.",
"#### Who are the source data providers?\n\nThe data was provided by a collaboration of institutions including Arvalis, INRAe, The University of Tokyo, University of Queensland, NEON, and EOLAB, among others.\n\n\n!image/png",
"### Annotations",
"#### Annotation process\n\nAnnotations for the dataset were focused on distinguishing between vegetation and background within the images. The process ensured that the images offered sufficient spatial resolution to allow for accurate visual segmentation.",
"#### Who are the annotators?\n\nThe annotations were performed by a team comprising researchers and domain experts from the contributing institutions.",
"## Considerations for Using the Data",
"### Social Impact of Dataset \n\nThe VegAnn dataset is expected to significantly impact agricultural research and commercial applications by enhancing the accuracy of crop monitoring, disease detection, and yield estimation through improved vegetation segmentation techniques.",
"### Discussion of Biases \n\nGiven the diverse sources of the images, there may be inherent biases towards certain crop types, geographical locations, and imaging conditions. Users should consider this diversity in applications and analyses.",
"### Licensing Information \n\nPlease refer to the specific licensing agreements of the contributing institutions or contact the dataset providers for more information on usage rights and restrictions.\n\n \n\nIf you use the VegAnn dataset in your research, please cite the following:",
"## Additional Information\n\n- Dataset Curators: Simon Madec et al.\n- Version: 1.0\n- License: CC-BY\n- Contact: URL@URL"
] |
ea9210b6670c73051d127c5c65019bc42db9f47f | A chunked version of [guidelines dataset](https://huggingface.co/datasets/epfl-llm/guidelines)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ewhfef/chunked_guidelines | [
"region:us"
] | 2024-01-18T15:12:37+00:00 | {"dataset_info": {"features": [{"name": "clean_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 457444538, "num_examples": 50805}], "download_size": 223229933, "dataset_size": 457444538}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T18:42:04+00:00 | [] | [] | TAGS
#region-us
| A chunked version of guidelines dataset
More Information needed | [] | [
"TAGS\n#region-us \n"
] |
27c9727b15a62a48defd1813bfbcbeeb06ac0652 |
# Guanaco-1k -> Llama-2 Dataset
Subset of 1000 samples of the [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) dataset.
It has been transformed to match Llama 2's prompt format according to [how to prompt llama 2](https://huggingface.co/blog/llama2#how-to-prompt-llama-2).
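A quick way to check the resulting format is to print a sample; the commented shape below is an assumption based on the standard Llama 2 chat template and should be verified against the actual output.
```python
from datasets import load_dataset

ds = load_dataset("federicopuy/guanaco-llama2-1k", split="train")
print(ds[0]["text"])
# Roughly expected shape:
# <s>[INST] user prompt [/INST] assistant answer </s>
```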
[Colab notebook](https://colab.research.google.com/drive/10JdqSHT9vSQe7i9feIGlQsqTAxmHdnsu?usp=sharing). | federicopuy/guanaco-llama2-1k | [
"region:us"
] | 2024-01-18T15:38:27+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1572908, "num_examples": 1000}], "download_size": 923271, "dataset_size": 1572908}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T16:00:21+00:00 | [] | [] | TAGS
#region-us
|
# Guanaco-1k -> Llama-2 Dataset
Subset of 1000 samples of the 'timdettmers/openassistant-guanaco' dataset.
It has been transformed to match Llama 2's prompt format according to how to prompt llama 2.
Colab notebook. | [
"# Guanaco-1k -> Llama-2 Dataset\n\nSubset of 1000 samples of the 'timdettmers/openassistant-guanaco' dataset.\n\nIt has been transformed to match Llama 2's prompt format according to how to prompt llama 2.\n\nColab notebook."
] | [
"TAGS\n#region-us \n",
"# Guanaco-1k -> Llama-2 Dataset\n\nSubset of 1000 samples of the 'timdettmers/openassistant-guanaco' dataset.\n\nIt has been transformed to match Llama 2's prompt format according to how to prompt llama 2.\n\nColab notebook."
] |
b696bb7e40d511fd7d66b605c96796625a652bf0 | POLITERWRITE
Dataset for https://arxiv.org/pdf/2212.10190.pdf
Test data available soon. | jdustinwind/Polite | [
"arxiv:2212.10190",
"region:us"
] | 2024-01-18T15:47:45+00:00 | {} | 2024-01-18T15:53:58+00:00 | [
"2212.10190"
] | [] | TAGS
#arxiv-2212.10190 #region-us
| POLITERWRITE
Dataset for URL
Test data available soon. | [] | [
"TAGS\n#arxiv-2212.10190 #region-us \n"
] |
e2d9200602686bada5c63f3bd9cba2260f6db04b | This dataset contains the first 40k prompts from LAION/CC/SBU BLIP-Caption Concept-balanced 558K which we use for rapid testing of the Robin model setup on new compute.
This is based on the data used in LLaVA: https://github.com/haotian-liu/LLaVA/blob/main/docs/Data.md
With a batch size of 256, the 40k prompts give roughly 150 iterations (40,000 / 256 ≈ 156), which is enough to check checkpointing and the final model save. | agi-collective/Robin-test-data | [
"region:us"
] | 2024-01-18T15:54:32+00:00 | {} | 2024-01-18T20:48:20+00:00 | [] | [] | TAGS
#region-us
| This dataset contains the first 40k prompts from LAION/CC/SBU BLIP-Caption Concept-balanced 558K which we use for rapid testing of the Robin model setup on new compute.
This is based on the data used in LLaVA: URL
This does about 150 iterations with a batch size of 256 to check checkpointing and final model save. | [] | [
"TAGS\n#region-us \n"
] |
eb00eb307dce7dd8b7d6aa2fa3c879a0eb6c1c45 | # Dataset Card for "wsd_myriade_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_v2 | [
"region:us"
] | 2024-01-18T16:02:20+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 113146404, "num_examples": 134856}], "download_size": 14670365, "dataset_size": 113146404}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T16:02:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_v2"
More Information needed | [
"# Dataset Card for \"wsd_myriade_v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_v2\"\n\nMore Information needed"
] |
d5c0afb6ce62cd886a0d02cfdcdd18d2077a0d0d |
Sample with the keyword "Technology" taken from https://huggingface.co/datasets/fabiochiu/medium-articles
| bunkalab/medium-sample-technology | [
"license:apache-2.0",
"region:us"
] | 2024-01-18T16:08:03+00:00 | {"license": "apache-2.0"} | 2024-01-18T16:16:44+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
Sample with the keyword "Technology" taken from URL
| [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
d90debada4864ca72b76e2c81996e434625a8816 |
# CoNLL-2012 (OntoNotes) with CoNLL-U Dependency Parses
- **Homepage:** [CoNLL-2012 Shared Task](https://conll.cemantix.org/2012/data.html), [Author's page](https://cemantix.org/data/ontonotes.html)
- **Repository:** [Mendeley](https://data.mendeley.com/datasets/zmycy7t9h9)
- **Conversion:** https://github.com/vdobrovolskii/wl-coref/commit/4af0aa04eefad5b68a1fb6ca48a846a449bfa4b0
## Details
The original constituency parse annotations of `coref-data/conll2012_raw`, converted to CoNLL-U dependency parses using `convert_to_heads.py` from https://github.com/vdobrovolskii/wl-coref.
## Citations
```
@inproceedings{dobrovolskii-2021-word,
title = "Word-Level Coreference Resolution",
author = "Dobrovolskii, Vladimir",
editor = "Moens, Marie-Francine and
Huang, Xuanjing and
Specia, Lucia and
Yih, Scott Wen-tau",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.605",
doi = "10.18653/v1/2021.emnlp-main.605",
pages = "7670--7675",
abstract = "Recent coreference resolution models rely heavily on span representations to find coreference links between word spans. As the number of spans is $O(n^2)$ in the length of text and the number of potential links is $O(n^4)$, various pruning techniques are necessary to make this approach computationally feasible. We propose instead to consider coreference links between individual words rather than word spans and then reconstruct the word spans. This reduces the complexity of the coreference model to $O(n^2)$ and allows it to consider all potential mentions without pruning any of them out. We also demonstrate that, with these changes, SpanBERT for coreference resolution will be significantly outperformed by RoBERTa. While being highly efficient, our model performs competitively with recent coreference resolution systems on the OntoNotes benchmark.",
}
@inproceedings{pradhan-etal-2013-towards,
title = "Towards Robust Linguistic Analysis using {O}nto{N}otes",
author = {Pradhan, Sameer and
Moschitti, Alessandro and
Xue, Nianwen and
Ng, Hwee Tou and
Bj{\"o}rkelund, Anders and
Uryupina, Olga and
Zhang, Yuchen and
Zhong, Zhi},
booktitle = "Proceedings of the Seventeenth Conference on Computational Natural Language Learning",
month = aug,
year = "2013",
address = "Sofia, Bulgaria",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W13-3516",
pages = "143--152",
}
``` | coref-data/conll2012_conllu | [
"license:other",
"region:us"
] | 2024-01-18T16:10:28+00:00 | {"license": "other"} | 2024-01-19T00:03:41+00:00 | [] | [] | TAGS
#license-other #region-us
|
# Phrase Detectives Version 3
- Homepage: CoNLL-2012 Shared Task, Author's page
- Repository: Mendeley
- Conversion: URL
## Details
The original consistuency parse annotations of 'coref-data/conll2012_raw' converted to conllu dependency parses using 'convert_to_heads.py' from URL
s
| [
"# Phrase Detectives Version 3\n\n- Homepage: CoNLL-2012 Shared Task, Author's page\n- Repository: Mendeley\n- Conversion: URL",
"## Details\n\nThe original consistuency parse annotations of 'coref-data/conll2012_raw' converted to conllu dependency parses using 'convert_to_heads.py' from URL\n\ns"
] | [
"TAGS\n#license-other #region-us \n",
"# Phrase Detectives Version 3\n\n- Homepage: CoNLL-2012 Shared Task, Author's page\n- Repository: Mendeley\n- Conversion: URL",
"## Details\n\nThe original consistuency parse annotations of 'coref-data/conll2012_raw' converted to conllu dependency parses using 'convert_to_heads.py' from URL\n\ns"
] |
1e9ccec49b3a8b6b345ce79683670c39a7d5b42b | ### About dataset
It is a dataset of multi-speaker speech with noise.
Each sample is at most 30 seconds long.
### Loading script
```
>>> from datasets import load_dataset
>>> data_files = {"test": "data/<your_subset>.parquet"}
>>> data = load_dataset("Zarakun/speakers_ua_test", data_files=data_files)
>>> data
DatasetDict({
test: Dataset({
features: ['num_speakers', 'utter', 'audio'],
num_rows: <some_number>
})
})
```
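Once loaded, a single example can be inspected as sketched below (this assumes `utter` decodes as a list of dicts, as described in the next section):
```python
split = next(iter(data.values()))  # whichever split was loaded above
example = split[0]

print(example["num_speakers"])   # number of speakers in this clip
for utt in example["utter"]:     # one entry per utterance
    print(utt["file_id"], utt["start"], utt["end"], utt["sentence"])
```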
### Dataset structure
Every example has the following:
**num_speakers** - the number of speakers
**utter** - list of utterance data
**audio** - the waveform of the audio
Each entry in **utter** is a dict with the following structure:
**start** - the starting position in **audio** of the speaker audio
**end** - the ending position in **audio** of the speaker audio
**file_id** - the identifier of the speaker
**sentence** - the transcription
**rate** - the audio sample rate; it has to be the same across all examples in the dataset (if it is not, something has gone wrong) | Zarakun/speakers_ua_test | [
"audio",
"region:us"
] | 2024-01-18T16:17:30+00:00 | {"tags": ["audio"], "configs": [{"config_name": "test", "data_files": "data/test.parquet"}, {"config_name": "small_test", "data_files": "data/small_test.parquet"}]} | 2024-01-18T18:47:51+00:00 | [] | [] | TAGS
#audio #region-us
| ### About dataset
It is a dataset of multispeaker speech with noise
each sample is at most 30 seconds
### Loading script
### Dataset structure
Every example has the following:
num_speakers - the number of speakers
utter - list of utterences data
audio - the waveform of the audio
Each entry in the utter is a dict and has the following structure:
start - the starting position in audio of the speaker audio
end - the ending position in audio of the speaker audio
file_id - the identificator of the speaker
sentence - the transcription
rate - has to be the same across all examples in the dataset, if not something bad happened | [
"### About dataset\nIt is a dataset of multispeaker speech with noise \neach sample is at most 30 seconds",
"### Loading script",
"### Dataset structure\nEvery example has the following: \nnum_speakers - the number of speakers \nutter - list of utterences data \naudio - the waveform of the audio \n\nEach entry in the utter is a dict and has the following structure: \nstart - the starting position in audio of the speaker audio \nend - the ending position in audio of the speaker audio \nfile_id - the identificator of the speaker \nsentence - the transcription \nrate - has to be the same across all examples in the dataset, if not something bad happened"
] | [
"TAGS\n#audio #region-us \n",
"### About dataset\nIt is a dataset of multispeaker speech with noise \neach sample is at most 30 seconds",
"### Loading script",
"### Dataset structure\nEvery example has the following: \nnum_speakers - the number of speakers \nutter - list of utterences data \naudio - the waveform of the audio \n\nEach entry in the utter is a dict and has the following structure: \nstart - the starting position in audio of the speaker audio \nend - the ending position in audio of the speaker audio \nfile_id - the identificator of the speaker \nsentence - the transcription \nrate - has to be the same across all examples in the dataset, if not something bad happened"
] |
e5a194ba9c53da01acfb8c79589811fb87d7b9cf |
# WinoGrande Recast as Coreference Resolution
### Dataset Summary
WinoGrande train and development sets recast as coreference resolution as described in [Investigating Failures to Generalize for Coreference Resolution Models](https://arxiv.org/abs/2303.09092). Conllu columns are parsed using Stanza.
### Data Fields
```python
{
"id": str, # example id
"text": str, # untokenized example text
"sentences": [
{
"id": int, # sentence index
"text": str, # untokenized sentence text
"speaker": None, # speaker
"tokens": [
{
# keys are conllu columns: id, text, lemma, upos, xpos, feats, head, deprel, deps, misc
},
...
]
},
...
],
"coref_chains": List[List[List[int]]], # list of clusters, each cluster is a list of mentions, each mention is a span represented as [sent, start, end] inclusive
"genre": "crowdsourced",
"meta_data": {
"comment": "syntax_annotations=stanza|tokenizer=stanza|detokenizer=nltk",
},
}
```
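For instance, the fields can be inspected after loading the dataset. This is only a sketch: the available split names and the exact span-indexing convention (assumed here to be 0-based, sentence-local, and end-inclusive) should be verified against a few examples.
```python
from datasets import load_dataset

ds = load_dataset("coref-data/winogrande_coref")
split = ds[next(iter(ds))]  # take whichever split comes first
ex = split[0]

print(ex["text"])
for cluster in ex["coref_chains"]:
    # each mention is [sentence_index, start_token, end_token], end inclusive
    mentions = [
        " ".join(tok["text"] for tok in ex["sentences"][s]["tokens"][start:end + 1])
        for s, start, end in cluster
    ]
    print(mentions)
```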
### Citation Information
```
@misc{porada2023investigating,
title={Investigating Failures to Generalize for Coreference Resolution Models},
author={Ian Porada and Alexandra Olteanu and Kaheer Suleman and Adam Trischler and Jackie Chi Kit Cheung},
year={2023},
eprint={2303.09092},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@InProceedings{ai2:winogrande,
title = {WinoGrande: An Adversarial Winograd Schema Challenge at Scale},
    author = {Sakaguchi, Keisuke and Le Bras, Ronan and Bhagavatula, Chandra and Choi, Yejin},
    year = {2019}
}
```
| coref-data/winogrande_coref | [
"license:cc-by-4.0",
"arxiv:2303.09092",
"region:us"
] | 2024-01-18T16:27:12+00:00 | {"license": "cc-by-4.0"} | 2024-01-19T00:03:44+00:00 | [
"2303.09092"
] | [] | TAGS
#license-cc-by-4.0 #arxiv-2303.09092 #region-us
|
# Wingrande Recast as Coreference Resolution
### Dataset Summary
WinoGrande train and development sets recast as coreference resolution as described in Investigating Failures to Generalize for Coreference Resolution Models. Conllu columns are parsed using Stanza.
### Data Fields
| [
"# Wingrande Recast as Coreference Resolution",
"### Dataset Summary\n\nWinoGrande train and development sets recast as coreference resolution as described in Investigating Failures to Generalize for Coreference Resolution Models. Conllu columns are parsed using Stanza.",
"### Data Fields"
] | [
"TAGS\n#license-cc-by-4.0 #arxiv-2303.09092 #region-us \n",
"# Wingrande Recast as Coreference Resolution",
"### Dataset Summary\n\nWinoGrande train and development sets recast as coreference resolution as described in Investigating Failures to Generalize for Coreference Resolution Models. Conllu columns are parsed using Stanza.",
"### Data Fields"
] |
ca15ae5c70dd7685e59293ade6e16cafe8a18c9f | Dataset used for Real Gurls models.
NSFW trans females. | graizelle/real_gurls_data | [
"size_categories:n<1K",
"license:cc-by-sa-4.0",
"tgirl",
"trans-female",
"region:us"
] | 2024-01-18T16:29:14+00:00 | {"license": "cc-by-sa-4.0", "size_categories": ["n<1K"], "pretty_name": "Real Gurls dataset", "tags": ["tgirl", "trans-female"]} | 2024-01-18T16:33:51+00:00 | [] | [] | TAGS
#size_categories-n<1K #license-cc-by-sa-4.0 #tgirl #trans-female #region-us
| Dataset used for Real Gurls models.
NSFW trans females. | [] | [
"TAGS\n#size_categories-n<1K #license-cc-by-sa-4.0 #tgirl #trans-female #region-us \n"
] |
7777e88efdfb21fcb87c2b42639b672a0ef7a4f3 |
# Dataset Card for Evaluation run of AA051611/A0118
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/A0118](https://huggingface.co/AA051611/A0118) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__A0118",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T23:48:21.810095](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0118/blob/main/results_2024-01-18T23-48-21.810095.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6750935567286499,
"acc_stderr": 0.03150224444254494,
"acc_norm": 0.6839013238259298,
"acc_norm_stderr": 0.03214560635872275,
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343144,
"mc2": 0.5579325936654852,
"mc2_stderr": 0.015526306494139296
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642476,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.0143610972884497
},
"harness|hellaswag|10": {
"acc": 0.6517625970922127,
"acc_stderr": 0.004754380554929216,
"acc_norm": 0.8378809002190799,
"acc_norm_stderr": 0.0036780679944244557
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026782,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026782
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853137,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853137
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7256410256410256,
"acc_stderr": 0.022622765767493214,
"acc_norm": 0.7256410256410256,
"acc_norm_stderr": 0.022622765767493214
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630882,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630882
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377342,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377342
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.0313217980308329,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.0313217980308329
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179337,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179337
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8569604086845466,
"acc_stderr": 0.012520023176796515,
"acc_norm": 0.8569604086845466,
"acc_norm_stderr": 0.012520023176796515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.0239291555173513,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.0239291555173513
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5071707953063885,
"acc_stderr": 0.012768922739553311,
"acc_norm": 0.5071707953063885,
"acc_norm_stderr": 0.012768922739553311
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201264,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201264
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7238562091503268,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.7238562091503268,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174917,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174917
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343144,
"mc2": 0.5579325936654852,
"mc2_stderr": 0.015526306494139296
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774083
},
"harness|gsm8k|5": {
"acc": 0.26383623957543595,
"acc_stderr": 0.012139386425126807
}
}
```
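The JSON block above closes the per-task breakdown. For readers who prefer a ranked view over scanning the raw values, the following minimal sketch (plain Python, not part of the original card) parses a local copy of that JSON; the file name `results.json` is hypothetical, and the handling of an `all` aggregate key and the TruthfulQA `mc1`/`mc2` fields simply follows the layout visible in the block.
```python
import json

# Minimal sketch: rank the per-task scores from the JSON block above.
# Assumes the block has been saved locally as "results.json" (hypothetical name).
with open("results.json") as f:
    results = json.load(f)

rows = []
for task, metrics in results.items():
    if task == "all":  # skip the aggregate entry if present
        continue
    score = metrics.get("acc", metrics.get("mc2"))
    if score is None:
        continue
    stderr = metrics.get("acc_stderr", metrics.get("mc2_stderr", 0.0))
    rows.append((score, stderr, task))

for score, stderr, task in sorted(rows, reverse=True):
    # report an approximate 95% interval as score ± 1.96 * stderr
    print(f"{score:.3f} ± {1.96 * stderr:.3f}  {task}")
```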
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
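Although the structure description is still marked as missing, the per-task configurations of this repository can be listed programmatically. The sketch below is a minimal example; it assumes the repository id `open-llm-leaderboard/details_AA051611__A0118` used for this card and requires network access.
```python
from datasets import get_dataset_config_names

# List the configurations exposed by this results repository
# (typically one per evaluated task). Requires network access.
configs = get_dataset_config_names("open-llm-leaderboard/details_AA051611__A0118")
print(f"{len(configs)} configurations, e.g. {configs[:5]}")
```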
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["results_2024-01-18T23-48-21.810095.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T23-48-21.810095.parquet"]}]}]} | 2024-01-18T23:50:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051611/A0118
Dataset automatically created during the evaluation run of model AA051611/A0118 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
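For example, a minimal sketch in Python, assuming the details repository for this run follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the exact repo id below is an assumption):

```python
from datasets import load_dataset

# Load one of the 63 task configurations; the repo id is assumed from the
# leaderboard's "details_<org>__<model>" convention for AA051611/A0118.
data = load_dataset(
    "open-llm-leaderboard/details_AA051611__A0118",
    "harness_winogrande_5",
    split="train",
)
```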
## Latest results
These are the latest results from run 2024-01-18T23:48:21.810095 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051611/A0118\n\n\n\nDataset automatically created during the evaluation run of model AA051611/A0118 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T23:48:21.810095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051611/A0118\n\n\n\nDataset automatically created during the evaluation run of model AA051611/A0118 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T23:48:21.810095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2f1cae622380f1da1939dea73be3753bf51eb7bd |
# Dataset Card for Evaluation run of cloudyu/Pluto_24B_DPO_200
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Pluto_24B_DPO_200](https://huggingface.co/cloudyu/Pluto_24B_DPO_200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200",
"harness_winogrande_5",
split="train")
```
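The aggregated metrics can be pulled the same way; a minimal sketch, assuming the "results" configuration exposes the same "latest" split as the per-task configurations (config and split names as described above):

```python
from datasets import load_dataset

# Aggregated results of the run; "latest" always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200",
    "results",
    split="latest",
)
```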
## Latest results
These are the [latest results from run 2024-01-18T17:18:01.366806](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200/blob/main/results_2024-01-18T17-18-01.366806.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6487883183265996,
"acc_stderr": 0.03206766377553213,
"acc_norm": 0.649809388886223,
"acc_norm_stderr": 0.03271483221046768,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.017497717944299822,
"mc2": 0.6986184584005906,
"mc2_stderr": 0.014631943760685329
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955003,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6717785301732723,
"acc_stderr": 0.004686062421158146,
"acc_norm": 0.8637721569408484,
"acc_norm_stderr": 0.0034232928816321398
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305526,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993457,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993457
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667888,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667888
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922438,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.017497717944299822,
"mc2": 0.6986184584005906,
"mc2_stderr": 0.014631943760685329
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710683
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831497
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200 | [
"region:us"
] | 2024-01-18T17:20:18+00:00 | {"pretty_name": "Evaluation run of cloudyu/Pluto_24B_DPO_200", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Pluto_24B_DPO_200](https://huggingface.co/cloudyu/Pluto_24B_DPO_200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T17:18:01.366806](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200/blob/main/results_2024-01-18T17-18-01.366806.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6487883183265996,\n \"acc_stderr\": 0.03206766377553213,\n \"acc_norm\": 0.649809388886223,\n \"acc_norm_stderr\": 0.03271483221046768,\n \"mc1\": 0.5128518971848225,\n \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.6986184584005906,\n \"mc2_stderr\": 0.014631943760685329\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955003,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6717785301732723,\n \"acc_stderr\": 0.004686062421158146,\n \"acc_norm\": 0.8637721569408484,\n \"acc_norm_stderr\": 0.0034232928816321398\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993457,\n 
\"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993457\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667888,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667888\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922438,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922438\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n \"mc1_stderr\": 0.017497717944299822,\n \"mc2\": 0.6986184584005906,\n \"mc2_stderr\": 0.014631943760685329\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710683\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \"acc_stderr\": 0.013059111935831497\n }\n}\n```", "repo_url": 
"https://huggingface.co/cloudyu/Pluto_24B_DPO_200", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|arc:challenge|25_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|gsm8k|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hellaswag|10_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T17_18_01.366806", "path": ["**/details_harness|winogrande|5_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T17-18-01.366806.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T17_18_01.366806", "path": ["results_2024-01-18T17-18-01.366806.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T17-18-01.366806.parquet"]}]}]} | 2024-01-18T17:20:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Pluto_24B_DPO_200
Dataset automatically created during the evaluation run of model cloudyu/Pluto_24B_DPO_200 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
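A minimal sketch (the repository id below is an assumption based on the standard Open LLM Leaderboard naming for per-model details datasets, and `harness_winogrande_5` is just one of the 63 available configurations):
```python
from datasets import load_dataset

# Assumed repository id, following the usual "details_<org>__<model>" naming pattern
data = load_dataset("open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200",
	"harness_winogrande_5",
	split="train")
```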
## Latest results
These are the latest results from run 2024-01-18T17:18:01.366806 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
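The aggregated numbers themselves are not reproduced here. Assuming the same repository id as in the loading sketch above, they can be pulled from the "results" configuration (whose "latest" split always points to the most recent run), for instance:
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration listed in this card's metadata;
# the repository id is the same assumed one as in the loading example above.
results = load_dataset("open-llm-leaderboard/details_cloudyu__Pluto_24B_DPO_200",
	"results",
	split="latest")
print(results[0])
```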
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Pluto_24B_DPO_200\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Pluto_24B_DPO_200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T17:18:01.366806(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Pluto_24B_DPO_200\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Pluto_24B_DPO_200 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T17:18:01.366806(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cb090636fa991e535e3e90d1fa31cf7767857202 | # Dataset Card for "IIT-AFF-Dataset-Modified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | HCIE/IIT-AFF-Dataset-Modified | [
"region:us"
] | 2024-01-18T17:36:27+00:00 | {"dataset_info": {"features": [{"name": "original_image_path", "dtype": "image"}, {"name": "modified_image_path", "dtype": "image"}, {"name": "image_modification_prompt", "dtype": "string"}, {"name": "data_set", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 573043214.09, "num_examples": 29910}], "download_size": 380706160, "dataset_size": 573043214.09}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T17:36:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "IIT-AFF-Dataset-Modified"
More Information needed | [
"# Dataset Card for \"IIT-AFF-Dataset-Modified\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"IIT-AFF-Dataset-Modified\"\n\nMore Information needed"
] |
083e245106c395e4efe0d6371eab3be0680b2e96 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset has 6 .h5 and 6 .mat files that are used by the crowd counting demo.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Rootstrap
- **License:** MIT
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://www.kaggle.com/datasets/tthien/shanghaitech
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The dataset is used for the demo
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is intended to be used only for the demo.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset was not used for training the model and can not be used for training a new model as it is very limited.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The dataset consists of 6 .h5 and 6 .mat files.
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
This dataset was created for the demo
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
These .mat files were obtained from the ShanghaiTech Dataset, and the .h5 files were generated from the .mat files using Python.
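A minimal sketch of what such a conversion could look like (the file names, the "image_info" annotation key, the image size, and the Gaussian density-map step are illustrative assumptions, not the exact script used for this dataset):
```python
import h5py
import numpy as np
import scipy.io as sio
from scipy.ndimage import gaussian_filter

# Illustrative ground-truth file; the real ShanghaiTech layout may differ.
mat = sio.loadmat("GT_IMG_1.mat")
# ShanghaiTech ground truth is commonly accessed through the nested "image_info"
# struct; each row of the resulting array is an annotated head position (x, y).
points = mat["image_info"][0, 0][0, 0][0]

# Build a simple density map by placing a blurred unit mass at each head.
height, width = 768, 1024  # assumed image size for this sketch
density = np.zeros((height, width), dtype=np.float32)
for x, y in points:
    if 0 <= int(y) < height and 0 <= int(x) < width:
        density[int(y), int(x)] += 1.0
density = gaussian_filter(density, sigma=15)

with h5py.File("IMG_1.h5", "w") as f:
    f.create_dataset("density", data=density)
```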
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The ShanghaiTech Dataset part B has 400 images. From this original dataset, 6 random files were gathered.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
@inproceedings{zhang2016single, title={Single-image crowd counting via multi-column convolutional neural network}, author={Zhang, Yingying and Zhou, Desen and Chen, Siqin and Gao, Shenghua and Ma, Yi}, booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition}, pages={589--597}, year={2016} }
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
As we stated before, this dataset is only meant to be used for the demo and cannot be reproduced in any way.
## Dataset Card Authors
Rootstrap
## Dataset Card Contact
[email protected] | rootstrap-org/crowd-counting | [
"task_categories:object-detection",
"size_categories:n<1K",
"language:en",
"license:mit",
"crowd-counting",
"cnn",
"detection",
"region:us"
] | 2024-01-18T18:09:53+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["object-detection"], "pretty_name": "crowd counting", "tags": ["crowd-counting", "cnn", "detection"]} | 2024-02-01T13:44:01+00:00 | [] | [
"en"
] | TAGS
#task_categories-object-detection #size_categories-n<1K #language-English #license-mit #crowd-counting #cnn #detection #region-us
|
# Dataset Card for Dataset Name
This dataset has 6 .h5 and 6 .mat files that are used by the crowd counting demo.
## Dataset Details
### Dataset Description
- Curated by: Rootstrap
- License: MIT
### Dataset Sources
- Repository: URL
## Uses
The dataset is used for the demo
### Direct Use
This dataset is intended to be used only for the demo.
### Out-of-Scope Use
This dataset was not used for training the model and can not be used for training a new model as it is very limited.
## Dataset Structure
The dataset consists of 6 .h5 and 6 .mat files.
## Dataset Creation
### Curation Rationale
This dataset was created for the demo
### Source Data
These .mat files were obtained from the ShanghaiTech Dataset, and the .h5 files were generated from the .mat files using Python.
#### Data Collection and Processing
The ShanghaiTech Dataset part B has 400 images. From this original dataset, 6 random files were gathered.
#### Who are the source data producers?
@inproceedings{zhang2016single, title={Single-image crowd counting via multi-column convolutional neural network}, author={Zhang, Yingying and Zhou, Desen and Chen, Siqin and Gao, Shenghua and Ma, Yi}, booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition}, pages={589--597}, year={2016} }
## Bias, Risks, and Limitations
As we stated before, this dataset is only meant to be used for the demo and cannot be reproduced in any way.
## Dataset Card Authors
Rootstrap
## Dataset Card Contact
info@URL | [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card has 6 .h5 and 6 .mat files that are used by the crowd counting demo",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: Rootstrap\n- License: MIT",
"### Dataset Sources \n\n\n\n- Repository: URL",
"## Uses\n\n\n\nThe dataset is used for the demo",
"### Direct Use\n\n\n\nThis dataset is intended to use only for the demo",
"### Out-of-Scope Use\n\n\n\nThis dataset was not used for training the model and can not be used for training a new model as it is very limited.",
"## Dataset Structure\n\n\n\nThe dataset consist of 6 .h5 and 6 .mat.",
"## Dataset Creation",
"### Curation Rationale\n\n\n\nThis dataset was created for the demo",
"### Source Data\n\n\n\nThis .mat files where obtained from the ShangaiTech Dataset and the .h5 were generated from the .mat files using python.",
"#### Data Collection and Processing\n\n\n\nThe ShangaiTech Dataset part B has 400 images. From this original dataset, 6 random files were gathered.",
"#### Who are the source data producers?\n\n\n\n@inproceedings{zhang2016single, title={Single-image crowd counting via multi-column convolutional neural network}, author={Zhang, Yingying and Zhou, Desen and Chen, Siqin and Gao, Shenghua and Ma, Yi}, booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition}, pages={589--597}, year={2016} }",
"## Bias, Risks, and Limitations\n\n\n\nAs we stated before, this dataset is only ment to be used for the demo and cannot be reproduced in any way.",
"## Dataset Card Authors\n\nRootstrap",
"## Dataset Card Contact\n\ninfo@URL"
] | [
"TAGS\n#task_categories-object-detection #size_categories-n<1K #language-English #license-mit #crowd-counting #cnn #detection #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card has 6 .h5 and 6 .mat files that are used by the crowd counting demo",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: Rootstrap\n- License: MIT",
"### Dataset Sources \n\n\n\n- Repository: URL",
"## Uses\n\n\n\nThe dataset is used for the demo",
"### Direct Use\n\n\n\nThis dataset is intended to use only for the demo",
"### Out-of-Scope Use\n\n\n\nThis dataset was not used for training the model and can not be used for training a new model as it is very limited.",
"## Dataset Structure\n\n\n\nThe dataset consist of 6 .h5 and 6 .mat.",
"## Dataset Creation",
"### Curation Rationale\n\n\n\nThis dataset was created for the demo",
"### Source Data\n\n\n\nThis .mat files where obtained from the ShangaiTech Dataset and the .h5 were generated from the .mat files using python.",
"#### Data Collection and Processing\n\n\n\nThe ShangaiTech Dataset part B has 400 images. From this original dataset, 6 random files were gathered.",
"#### Who are the source data producers?\n\n\n\n@inproceedings{zhang2016single, title={Single-image crowd counting via multi-column convolutional neural network}, author={Zhang, Yingying and Zhou, Desen and Chen, Siqin and Gao, Shenghua and Ma, Yi}, booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition}, pages={589--597}, year={2016} }",
"## Bias, Risks, and Limitations\n\n\n\nAs we stated before, this dataset is only ment to be used for the demo and cannot be reproduced in any way.",
"## Dataset Card Authors\n\nRootstrap",
"## Dataset Card Contact\n\ninfo@URL"
] |
433d7283a329e0aea752080c778a2db1f7aae856 |
# Dataset Card for Evaluation run of leveldevai/TurdusBeagle-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leveldevai/TurdusBeagle-7B](https://huggingface.co/leveldevai/TurdusBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leveldevai__TurdusBeagle-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T18:27:55.293799](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__TurdusBeagle-7B/blob/main/results_2024-01-18T18-27-55.293799.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533770356186305,
"acc_stderr": 0.032071476577749926,
"acc_norm": 0.6525881962766505,
"acc_norm_stderr": 0.032742193158041825,
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6971449186129537,
"mc2_stderr": 0.015083616284271144
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.01287592915129704
},
"harness|hellaswag|10": {
"acc": 0.722266480780721,
"acc_stderr": 0.004469659042824774,
"acc_norm": 0.8888667596096396,
"acc_norm_stderr": 0.0031365472766898906
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990334,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990334
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.576499388004896,
"mc1_stderr": 0.01729742144853475,
"mc2": 0.6971449186129537,
"mc2_stderr": 0.015083616284271144
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519654
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_leveldevai__TurdusBeagle-7B | [
"region:us"
] | 2024-01-18T18:30:14+00:00 | {"pretty_name": "Evaluation run of leveldevai/TurdusBeagle-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [leveldevai/TurdusBeagle-7B](https://huggingface.co/leveldevai/TurdusBeagle-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__TurdusBeagle-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T18:27:55.293799](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__TurdusBeagle-7B/blob/main/results_2024-01-18T18-27-55.293799.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533770356186305,\n \"acc_stderr\": 0.032071476577749926,\n \"acc_norm\": 0.6525881962766505,\n \"acc_norm_stderr\": 0.032742193158041825,\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6971449186129537,\n \"mc2_stderr\": 0.015083616284271144\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.01287592915129704\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.722266480780721,\n \"acc_stderr\": 0.004469659042824774,\n \"acc_norm\": 0.8888667596096396,\n \"acc_norm_stderr\": 0.0031365472766898906\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.01358661921990334,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.01358661921990334\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.576499388004896,\n \"mc1_stderr\": 0.01729742144853475,\n \"mc2\": 0.6971449186129537,\n \"mc2_stderr\": 0.015083616284271144\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \"acc_stderr\": 0.012616300735519654\n }\n}\n```", "repo_url": 
"https://huggingface.co/leveldevai/TurdusBeagle-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|arc:challenge|25_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|gsm8k|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hellaswag|10_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T18-27-55.293799.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T18-27-55.293799.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T18-27-55.293799.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T18-27-55.293799.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T18-27-55.293799.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T18_27_55.293799", "path": ["**/details_harness|winogrande|5_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T18-27-55.293799.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T18_27_55.293799", "path": ["results_2024-01-18T18-27-55.293799.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T18-27-55.293799.parquet"]}]}]} | 2024-01-18T18:30:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of leveldevai/TurdusBeagle-7B
Dataset automatically created during the evaluation run of model leveldevai/TurdusBeagle-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
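For example, here is a minimal loading sketch that mirrors the snippet stored in this repository's metadata (`harness_winogrande_5` is one of the 63 configurations; any other configuration name works the same way):

```python
from datasets import load_dataset

# Load the Winogrande details for this evaluation run; "train" always points to the latest results
data = load_dataset("open-llm-leaderboard/details_leveldevai__TurdusBeagle-7B",
    "harness_winogrande_5",
    split="train")
```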
## Latest results
These are the latest results from run 2024-01-18T18:27:55.293799 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
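For reference, the aggregate scores of that run, copied verbatim from the results JSON stored in this repository's metadata (per-task scores are available in the same file):

```python
{
    "all": {
        "acc": 0.6533770356186305,
        "acc_stderr": 0.032071476577749926,
        "acc_norm": 0.6525881962766505,
        "acc_norm_stderr": 0.032742193158041825,
        "mc1": 0.576499388004896,
        "mc1_stderr": 0.01729742144853475,
        "mc2": 0.6971449186129537,
        "mc2_stderr": 0.015083616284271144
    }
}
```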
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of leveldevai/TurdusBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/TurdusBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T18:27:55.293799(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of leveldevai/TurdusBeagle-7B\n\n\n\nDataset automatically created during the evaluation run of model leveldevai/TurdusBeagle-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T18:27:55.293799(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
aec43cac1c1dbff32de8e66da6517ee6b29b7e8a |
## PUG: ImageNet
The PUG: ImageNet dataset contains 88,328 pre-rendered images based on Unreal Engine using 724 assets representing 151 ImageNet classes with 64 environments, 7 sizes, 9 textures, 18 different camera orientations, 18 different character orientations and 7 light intensities. In contrast to PUG: Animals, PUG: ImageNet was created by varying only a single factor at a time (which explains the lower number of images than PUG: Animals despite using more factors). The main purpose of this dataset is to provide a novel, useful benchmark, paralleling ImageNet, but for fine-grained evaluation of the robustness of image classifiers, along several factors of variation.
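A minimal usage sketch is shown below. It assumes only the standard `datasets` library API; the dataset id and the feature names are taken from this repository's metadata, and streaming is used here simply to avoid downloading the full ~29 GB archive before inspecting samples:

```python
from datasets import load_dataset

# Stream the single "train" split instead of materializing the whole dataset locally
pug_imagenet = load_dataset("facebook/PUG_ImageNet", split="train", streaming=True)

for sample in pug_imagenet:
    image = sample["image"]                # pre-rendered Unreal Engine image (PIL)
    label = sample["character_label"]      # ImageNet class of the rendered asset
    texture = sample["character_texture"]  # texture variation
    light = sample["scene_light"]          # light-intensity variation
    break  # inspect a single sample
```

Each record also carries the remaining factors of variation (world name, camera and character rotations, and character scale), so robustness can be evaluated one factor at a time, matching how the images were generated.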
## LICENSE
The datasets are distributed under the CC-BY-NC, with the addendum that they should not be used to train Generative AI models.
## Citing PUG
If you use one of the PUG datasets, please cite:
```
@misc{bordes2023pug,
title={PUG: Photorealistic and Semantically Controllable Synthetic Data for Representation Learning},
author={Florian Bordes and Shashank Shekhar and Mark Ibrahim and Diane Bouchacourt and Pascal Vincent and Ari S. Morcos},
year={2023},
eprint={2308.03977},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## To learn more about the PUG datasets:
Please visit the [website](https://pug.metademolab.com/) and the [github](https://github.com/facebookresearch/PUG) | facebook/PUG_ImageNet | [
"license:cc-by-nc-4.0",
"arxiv:2308.03977",
"region:us"
] | 2024-01-18T18:45:03+00:00 | {"license": "cc-by-nc-4.0", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "world_name", "dtype": "string"}, {"name": "character_name", "dtype": "string"}, {"name": "character_label", "dtype": "string"}, {"name": "character_rotation_yaw", "dtype": "int64"}, {"name": "character_rotation_roll", "dtype": "int64"}, {"name": "character_rotation_pitch", "dtype": "int64"}, {"name": "character_scale", "dtype": "float64"}, {"name": "camera_roll", "dtype": "int64"}, {"name": "camera_pitch", "dtype": "int64"}, {"name": "camera_yaw", "dtype": "int64"}, {"name": "character_texture", "dtype": "string"}, {"name": "scene_light", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 29382707151.112, "num_examples": 88328}], "download_size": 29358745565, "dataset_size": 29382707151.112}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T19:46:20+00:00 | [
"2308.03977"
] | [] | TAGS
#license-cc-by-nc-4.0 #arxiv-2308.03977 #region-us
|
## PUG: ImageNet
The PUG: ImageNet dataset contains 88,328 pre-rendered images based on Unreal Engine using 724 assets representing 151 ImageNet classes with 64 environments, 7 sizes, 9 textures, 18 different camera orientations, 18 different character orientations and 7 light intensities. In contrast to PUG: Animals, PUG: ImageNet was created by varying only a single factor at a time (which explains the lower number of images than PUG: Animals despite using more factors). The main purpose of this dataset is to provide a novel, useful benchmark, paralleling ImageNet, but for fine-grained evaluation of the robustness of image classifiers, along several factors of variation.
## LICENSE
The datasets are distributed under the CC-BY-NC, with the addendum that they should not be used to train Generative AI models.
## Citing PUG
If you use one of the PUG datasets, please cite:
## To learn more about the PUG datasets:
Please visit the website and the github | [
"## PUG: ImageNet\nThe PUG: ImageNet dataset contains 88,328 pre-rendered images based on Unreal Engine using 724 assets representing 151 ImageNet classes with 64 environments, 7 sizes, 9 textures, 18 different camera orientations, 18 different character orientations and 7 light intensities. In contrast to PUG: Animals, PUG: ImageNet was created by varying only a single factor at a time (which explains the lower number of images than PUG: Animals despite using more factors). The main purpose of this dataset is to provide a novel, useful benchmark, paralleling ImageNet, but for fine-grained evaluation of the robustness of image classifiers, along several factors of variation.",
"## LICENSE\nThe datasets are distributed under the CC-BY-NC, with the addenda that they should not be used to train Generative AI models.",
"## Citing PUG\nIf you use one of the PUG datasets, please cite:",
"## To learn more about the PUG datasets:\nPlease visit the website and the github"
] | [
"TAGS\n#license-cc-by-nc-4.0 #arxiv-2308.03977 #region-us \n",
"## PUG: ImageNet\nThe PUG: ImageNet dataset contains 88,328 pre-rendered images based on Unreal Engine using 724 assets representing 151 ImageNet classes with 64 environments, 7 sizes, 9 textures, 18 different camera orientations, 18 different character orientations and 7 light intensities. In contrast to PUG: Animals, PUG: ImageNet was created by varying only a single factor at a time (which explains the lower number of images than PUG: Animals despite using more factors). The main purpose of this dataset is to provide a novel, useful benchmark, paralleling ImageNet, but for fine-grained evaluation of the robustness of image classifiers, along several factors of variation.",
"## LICENSE\nThe datasets are distributed under the CC-BY-NC, with the addenda that they should not be used to train Generative AI models.",
"## Citing PUG\nIf you use one of the PUG datasets, please cite:",
"## To learn more about the PUG datasets:\nPlease visit the website and the github"
] |
2e4e7153338202aa31542e6e9102e6f3cfa0b9ec |
### Liver Segmentation Datasets
This is a batch of 100 CT scans, containing the volumes (the scans) and their segmentations, which can be used to train a deep learning model for image segmentation. | pycad/liver-segmentation-100 | [
"license:mit",
"medical",
"medical imaging",
"image segmentation",
"deep learning",
"machine learning",
"computer vision",
"healthcare",
"liver",
"liver segmentation",
"region:us"
] | 2024-01-18T19:10:46+00:00 | {"license": "mit", "tags": ["medical", "medical imaging", "image segmentation", "deep learning", "machine learning", "computer vision", "healthcare", "liver", "liver segmentation"]} | 2024-01-18T19:27:13+00:00 | [] | [] | TAGS
#license-mit #medical #medical imaging #image segmentation #deep learning #machine learning #computer vision #healthcare #liver #liver segmentation #region-us
|
### Liver Segmentation Datasets
This is a batch of 100 CT scans, containing the volumes (the scans) and their segmentations, which can be used to train a deep learning model for image segmentation. | [
"### Liver Segmentation Datasets\nThis is a batch of 100 CT scans, where you can find the volumes (the scans) and their segmentation to train a deep learning model for image segmentation."
] | [
"TAGS\n#license-mit #medical #medical imaging #image segmentation #deep learning #machine learning #computer vision #healthcare #liver #liver segmentation #region-us \n",
"### Liver Segmentation Datasets\nThis is a batch of 100 CT scans, where you can find the volumes (the scans) and their segmentation to train a deep learning model for image segmentation."
] |
cf552f54d818123a80e446c59ec2c78d47518047 |
# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-3.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Hermes-low-tune-3.1](https://huggingface.co/nlpguy/Hermes-low-tune-3.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T19:43:57.257898](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1/blob/main/results_2024-01-18T19-43-57.257898.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.643479250503199,
"acc_stderr": 0.0322077411976957,
"acc_norm": 0.6450125760938904,
"acc_norm_stderr": 0.032855183947534713,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5358664438298202,
"mc2_stderr": 0.015241994807091694
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349812,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145678
},
"harness|hellaswag|10": {
"acc": 0.6568412666799442,
"acc_stderr": 0.004737936758047634,
"acc_norm": 0.8460466042620992,
"acc_norm_stderr": 0.003601664838718933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029196,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348397,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079069,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079069
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5358664438298202,
"mc2_stderr": 0.015241994807091694
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090255
},
"harness|gsm8k|5": {
"acc": 0.6345716451857468,
"acc_stderr": 0.013264282030266635
}
}
```
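
The same aggregated numbers are also stored in the `results` configuration of this repository, whose `latest` split points at the most recent run (see the configuration list in the dataset metadata). The sketch below is one way to read them back programmatically; the nested layout of the results row is an assumption and may differ between harness versions.

```python
from datasets import load_dataset

# The "results" config and its "latest" split are declared in this repository's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1",
    "results",
    split="latest",
)

# The parquet stores one row per run; inspect it without assuming exact nesting.
row = results[0]
print(row.keys())

# If the aggregated metrics are nested under a "results" field (an assumption),
# the overall accuracies reported above can be read back like this:
all_metrics = row.get("results", {}).get("all", {})
print(all_metrics.get("acc"), all_metrics.get("acc_norm"))
```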
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1 | [
"region:us"
] | 2024-01-18T19:46:12+00:00 | {"pretty_name": "Evaluation run of nlpguy/Hermes-low-tune-3.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [nlpguy/Hermes-low-tune-3.1](https://huggingface.co/nlpguy/Hermes-low-tune-3.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T19:43:57.257898](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1/blob/main/results_2024-01-18T19-43-57.257898.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.643479250503199,\n \"acc_stderr\": 0.0322077411976957,\n \"acc_norm\": 0.6450125760938904,\n \"acc_norm_stderr\": 0.032855183947534713,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5358664438298202,\n \"mc2_stderr\": 0.015241994807091694\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349812,\n \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145678\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6568412666799442,\n \"acc_stderr\": 0.004737936758047634,\n \"acc_norm\": 0.8460466042620992,\n \"acc_norm_stderr\": 0.003601664838718933\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029196,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029196\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n \"acc_stderr\": 0.015609929559348397,\n \"acc_norm\": 0.3206703910614525,\n \"acc_norm_stderr\": 0.015609929559348397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079069,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079069\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507215,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507215\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5358664438298202,\n \"mc2_stderr\": 0.015241994807091694\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090255\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6345716451857468,\n \"acc_stderr\": 0.013264282030266635\n 
}\n}\n```", "repo_url": "https://huggingface.co/nlpguy/Hermes-low-tune-3.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|arc:challenge|25_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|gsm8k|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hellaswag|10_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-43-57.257898.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-43-57.257898.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-43-57.257898.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T19-43-57.257898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-43-57.257898.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T19_43_57.257898", "path": ["**/details_harness|winogrande|5_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T19-43-57.257898.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T19_43_57.257898", "path": ["results_2024-01-18T19-43-57.257898.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T19-43-57.257898.parquet"]}]}]} | 2024-01-18T19:46:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-3.1
Dataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-3.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
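A minimal sketch of that load call, assuming the repository follows the standard `details_<org>__<model>` naming convention on the Hub and using the `harness_winogrande_5` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Repository id assumed from the usual Open LLM Leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_nlpguy__Hermes-low-tune-3.1",
    "harness_winogrande_5",  # any of the 63 configurations can be used here
    split="train",           # "train" always points to the latest results
)
```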
## Latest results
These are the latest results from run 2024-01-18T19:43:57.257898 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-3.1\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-3.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T19:43:57.257898(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-3.1\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-3.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T19:43:57.257898(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e603e69901503122ebe4e2e1027d1b33d0601c32 |
# Dataset Card for Evaluation run of intervitens/internlm2-base-20b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [intervitens/internlm2-base-20b-llama](https://huggingface.co/intervitens/internlm2-base-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama",
"harness_winogrande_5",
split="train")
```
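To discover which of the 63 configurations are available before loading one, a small sketch using the standard `datasets` helper (the repository id is the same as in the snippet above):

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama"
)
print(len(configs), "configurations, e.g.:", configs[:5])
```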
## Latest results
These are the [latest results from run 2024-01-18T19:55:30.426442](https://huggingface.co/datasets/open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama/blob/main/results_2024-01-18T19-55-30.426442.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6358012139707869,
"acc_stderr": 0.032345068549400245,
"acc_norm": 0.6407932610555195,
"acc_norm_stderr": 0.03299002256236763,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502342,
"mc2": 0.4411333942717491,
"mc2_stderr": 0.014265599043931959
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996081,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6154152559251145,
"acc_stderr": 0.00485502724839816,
"acc_norm": 0.8215494921330412,
"acc_norm_stderr": 0.003821090082721705
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548302,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.02686971618742991,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.02686971618742991
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.024433016466052462,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.024433016466052462
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291926,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.03995524007681681,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.03995524007681681
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976064,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716326,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457965,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457965
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489124,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489124
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.0248183501294366,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.0248183501294366
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310267,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218895,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218895
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.01275371692910101,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.01275371692910101
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988637,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988637
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304314,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502342,
"mc2": 0.4411333942717491,
"mc2_stderr": 0.014265599043931959
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.4488248673237301,
"acc_stderr": 0.013700157442788066
}
}
```
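The aggregated numbers above are also stored in the "results" configuration mentioned earlier; a minimal sketch for loading its latest split (the exact column layout of the results parquet is not documented here, so the schema is only inspected):

```python
from datasets import load_dataset

# The "latest" split always points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama",
    "results",
    split="latest",
)
print(results.column_names)  # check the available fields before reading metrics
```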
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama | [
"region:us"
] | 2024-01-18T19:57:39+00:00 | {"pretty_name": "Evaluation run of intervitens/internlm2-base-20b-llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [intervitens/internlm2-base-20b-llama](https://huggingface.co/intervitens/internlm2-base-20b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T19:55:30.426442](https://huggingface.co/datasets/open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama/blob/main/results_2024-01-18T19-55-30.426442.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6358012139707869,\n \"acc_stderr\": 0.032345068549400245,\n \"acc_norm\": 0.6407932610555195,\n \"acc_norm_stderr\": 0.03299002256236763,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4411333942717491,\n \"mc2_stderr\": 0.014265599043931959\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996081,\n \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6154152559251145,\n \"acc_stderr\": 0.00485502724839816,\n \"acc_norm\": 0.8215494921330412,\n \"acc_norm_stderr\": 0.003821090082721705\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.03761070869867479,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.03761070869867479\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548302,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8282828282828283,\n \"acc_stderr\": 0.02686971618742991,\n \"acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.02686971618742991\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 
0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.024433016466052462,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.024433016466052462\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291926,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.03995524007681681,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.03995524007681681\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976064,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976064\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716326,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716326\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457965,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457965\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489124,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489124\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.0248183501294366,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.0248183501294366\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310267,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218895,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218895\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.01275371692910101,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.01275371692910101\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988637,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988637\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304314,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304314\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4411333942717491,\n \"mc2_stderr\": 0.014265599043931959\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.4488248673237301,\n \"acc_stderr\": 0.013700157442788066\n }\n}\n```", "repo_url": "https://huggingface.co/intervitens/internlm2-base-20b-llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|arc:challenge|25_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|gsm8k|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hellaswag|10_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-55-30.426442.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-55-30.426442.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-55-30.426442.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T19-55-30.426442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T19-55-30.426442.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["**/details_harness|winogrande|5_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T19-55-30.426442.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T19_55_30.426442", "path": ["results_2024-01-18T19-55-30.426442.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T19-55-30.426442.parquet"]}]}]} | 2024-01-18T19:58:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of intervitens/internlm2-base-20b-llama
Dataset automatically created during the evaluation run of model intervitens/internlm2-base-20b-llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
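A minimal sketch of such a load call is given below. The repository id is an assumption here, inferred from the leaderboard's usual `details_<org>__<model>` naming convention (the stripped card above omits the explicit link), and the split name follows the "latest" alias described above:

```python
from datasets import load_dataset

# Repo id is an assumption based on the leaderboard's "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_intervitens__internlm2-base-20b-llama",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="latest",          # the "latest" split points at the most recent run
)
```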
## Latest results
These are the latest results from run 2024-01-18T19:55:30.426442 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of intervitens/internlm2-base-20b-llama\n\n\n\nDataset automatically created during the evaluation run of model intervitens/internlm2-base-20b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T19:55:30.426442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of intervitens/internlm2-base-20b-llama\n\n\n\nDataset automatically created during the evaluation run of model intervitens/internlm2-base-20b-llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T19:55:30.426442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
61844252d75a48d118dd997017a2e8ed0103058e |
# Dataset Card for Evaluation run of Kquant03/Prokaryote-8x7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Prokaryote-8x7B-bf16](https://huggingface.co/Kquant03/Prokaryote-8x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T20:11:57.513943](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16/blob/main/results_2024-01-18T20-11-57.513943.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.655551112846195,
"acc_stderr": 0.03200857802460192,
"acc_norm": 0.6550894523163624,
"acc_norm_stderr": 0.03267273078447577,
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960867,
"mc2": 0.6778730144008733,
"mc2_stderr": 0.015193091234587739
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.7167894841665007,
"acc_stderr": 0.004496369742132105,
"acc_norm": 0.8817964548894642,
"acc_norm_stderr": 0.003221891726851491
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554956,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512624,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250437,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.01654240195463191,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.01654240195463191
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960867,
"mc2": 0.6778730144008733,
"mc2_stderr": 0.015193091234587739
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363698
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515427
}
}
```
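The aggregated numbers shown above can also be pulled directly from the "results" configuration mentioned earlier. This is a sketch that relies on the split naming described in this card (timestamped splits plus a "latest" alias); the exact column layout of the returned row is not reproduced here:

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run; "latest" aliases the newest timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16",
    "results",
    split="latest",
)
print(results[0])  # one row with the run's aggregated scores
```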
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16 | [
"region:us"
] | 2024-01-18T20:14:16+00:00 | {"pretty_name": "Evaluation run of Kquant03/Prokaryote-8x7B-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Prokaryote-8x7B-bf16](https://huggingface.co/Kquant03/Prokaryote-8x7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T20:11:57.513943](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16/blob/main/results_2024-01-18T20-11-57.513943.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655551112846195,\n \"acc_stderr\": 0.03200857802460192,\n \"acc_norm\": 0.6550894523163624,\n \"acc_norm_stderr\": 0.03267273078447577,\n \"mc1\": 0.5397796817625459,\n \"mc1_stderr\": 0.017448017223960867,\n \"mc2\": 0.6778730144008733,\n \"mc2_stderr\": 0.015193091234587739\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7167894841665007,\n \"acc_stderr\": 0.004496369742132105,\n \"acc_norm\": 0.8817964548894642,\n \"acc_norm_stderr\": 0.003221891726851491\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554956,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 
0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512624,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250437,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.01654240195463191,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.01654240195463191\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5397796817625459,\n \"mc1_stderr\": 0.017448017223960867,\n \"mc2\": 0.6778730144008733,\n \"mc2_stderr\": 0.015193091234587739\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363698\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6952236542835482,\n \"acc_stderr\": 0.012679297549515427\n }\n}\n```", "repo_url": "https://huggingface.co/Kquant03/Prokaryote-8x7B-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|arc:challenge|25_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|gsm8k|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hellaswag|10_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T20-11-57.513943.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T20-11-57.513943.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T20-11-57.513943.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T20-11-57.513943.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T20-11-57.513943.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["**/details_harness|winogrande|5_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-18T20-11-57.513943.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T20_11_57.513943", "path": ["results_2024-01-18T20-11-57.513943.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T20-11-57.513943.parquet"]}]}]} | 2024-01-18T20:14:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Prokaryote-8x7B-bf16
Dataset automatically created during the evaluation run of model Kquant03/Prokaryote-8x7B-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
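A minimal sketch of such a load, assuming the repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the leaderboard and picking one of the per-task configuration names listed in this card's metadata (e.g. `harness_winogrande_5`):

```python
from datasets import load_dataset

# Load the details of a single task; the repository id below is inferred from the
# leaderboard naming pattern and may need adjusting.
data = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Prokaryote-8x7B-bf16",
    "harness_winogrande_5",
    split="train",  # "train" always points at the latest results
)
print(data)
```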
## Latest results
These are the latest results from run 2024-01-18T20:11:57.513943 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Prokaryote-8x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Prokaryote-8x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T20:11:57.513943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Prokaryote-8x7B-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Prokaryote-8x7B-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T20:11:57.513943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
efabeed1a59c02a714a778358fcac7d17872915a | # SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
Compiled dataset for SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials.
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://github.com/ai-systems/Task-2-SemEval-2024
- **Paper:** https://aclanthology.org/2023.semeval-1.307/
- **Demo:** [More Information Needed]
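As a rough sketch only, the dataset can typically be loaded with the Hugging Face `datasets` library; the configuration and split names are not documented in this card, so treat them as assumptions and check the repository before relying on them:

```python
from datasets import load_dataset

# Configuration/split names are assumptions; inspect the repository files if this fails.
ds = load_dataset("AshtonIsNotHere/nli4ct_semeval2024")
print(ds)
```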
| AshtonIsNotHere/nli4ct_semeval2024 | [
"task_categories:text-classification",
"task_categories:sentence-similarity",
"language:en",
"medical",
"region:us"
] | 2024-01-18T20:21:48+00:00 | {"language": ["en"], "task_categories": ["text-classification", "sentence-similarity"], "pretty_name": "SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials", "tags": ["medical"]} | 2024-01-18T20:48:38+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #task_categories-sentence-similarity #language-English #medical #region-us
| # SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials
## Dataset Details
### Dataset Description
Compiled dataset for SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials.
### Dataset Sources
- Repository: [URL
- Paper: [URL
- Demo:
| [
"# SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials",
"## Dataset Details",
"### Dataset Description\n\nCompiled dataset for SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials.",
"### Dataset Sources\n\n\n\n- Repository: [URL\n- Paper: [URL\n- Demo:"
] | [
"TAGS\n#task_categories-text-classification #task_categories-sentence-similarity #language-English #medical #region-us \n",
"# SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials",
"## Dataset Details",
"### Dataset Description\n\nCompiled dataset for SemEval 2024 Task 2: Safe Biomedical Natural Language Inference for Clinical Trials.",
"### Dataset Sources\n\n\n\n- Repository: [URL\n- Paper: [URL\n- Demo:"
] |
3a5d642fd60156895cd2fb76ca2fc732d98d4f1a | ### Virtual Food Policy Coach Training Data
This dataset contains the Prompt and Completion (P&C) pairs used to train the *Virtual Food Policy Coach*, an AI chatbot developed by the International Food Policy Research Institute (IFPRI) using Coachvox AI.
A total of 1,350 P&C pairs were generated from 21 documents using GPT-3.5.
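For illustration only, a single prompt–completion pair of this kind could be represented as below; the field names and the example text are hypothetical and are not drawn from the actual training file:

```python
# Hypothetical prompt-completion record (field names and content are illustrative only)
pair = {
    "prompt": "What policy options can help reduce post-harvest losses for smallholder farmers?",
    "completion": "Options include investing in storage infrastructure, improving market access, "
                  "and supporting farmer cooperatives, depending on the country context.",
}
```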
### Resources
* Virtual Food Policy Coach <https://app.coachvox.ai/avatar/XDAFnkbo53vQvfzrsnO7/fullscreen>
* IFPRI <https://ifpri.org>
* Coachvox AI <https://coachvox.ai>
* Blog post at Agrilinks <https://agrilinks.org/post/creating-virtual-food-policy-expert-using-artificial-intelligence-advantages-and-disadvantages>
* Working Paper <https://hdl.handle.net/10568/137261>
---
license: cc-by-4.0
---
| cgiar-digital-innovation/virtual-food-policy-coach_training-data | [
"region:us"
] | 2024-01-18T20:25:23+00:00 | {} | 2024-01-19T17:10:28+00:00 | [] | [] | TAGS
#region-us
| ### Virtual Food Policy Coach Training Data
The Prompt and Completion (P&C) pairs used to train the *Virtual Food Policy Coach*, an AI chatbot developed by the International Food Policy Research Institute (IFPRI) using Coachvox AI.
Total of 1,350 P&Cs were generated from 21 documents using GPT-3.5.
### Resources
* Virtual Food Policy Coach <URL
* IFPRI <URL>
* Coachvox AI <URL>
* Blog post at Agrilinks <URL
* Working Paper <URL
---
license: cc-by-4.0
---
| [
"### Virtual Food Policy Coach Training Data\nThe Prompt and Completion (P&C) pairs used to train the *Virtual Food Policy Coach*, an AI chatbot developed by the International Food Policy Research Institute (IFPRI) using Coachvox AI. \nTotal of 1,350 P&Cs were generated from 21 documents using GPT-3.5.",
"### Resources\n* Virtual Food Policy Coach <URL\n* IFPRI <URL>\n* Coachvox AI <URL>\n* Blog post at Agrilinks <URL\n* Working Paper <URL\n\n---\nlicense: cc-by-4.0\n---"
] | [
"TAGS\n#region-us \n",
"### Virtual Food Policy Coach Training Data\nThe Prompt and Completion (P&C) pairs used to train the *Virtual Food Policy Coach*, an AI chatbot developed by the International Food Policy Research Institute (IFPRI) using Coachvox AI. \nTotal of 1,350 P&Cs were generated from 21 documents using GPT-3.5.",
"### Resources\n* Virtual Food Policy Coach <URL\n* IFPRI <URL>\n* Coachvox AI <URL>\n* Blog post at Agrilinks <URL\n* Working Paper <URL\n\n---\nlicense: cc-by-4.0\n---"
] |
c06d287045116b4ea056552d8c277ba4f9e75519 |
## PUG: SPAR
PUG: SPAR (Scene, Position, Attribute, Relation) contains 43,560 test samples, with image-caption pairs that evaluate VLMs on scene and object recognition as well as on inter-object spatial relationships and object-attribute relationships. We utilize scenes containing up to two objects in 4 unique spatial relationships and 4 different texture variations.
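A minimal sketch of iterating over a few samples with the Hugging Face `datasets` library; the field names follow the features listed in this card's metadata:

```python
from datasets import load_dataset

# Stream the single "train" split to avoid downloading the full ~17 GB at once.
ds = load_dataset("facebook/PUG_SPAR", split="train", streaming=True)

for sample in ds.take(3):
    image = sample["image"]  # PIL image of the rendered scene
    print(
        sample["world_name"],
        sample["character_name"], sample["character1_pos"], sample["character_texture"],
        sample["character2_name"], sample["character2_pos"], sample["character2_texture"],
    )
```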
## LICENSE
The datasets are distributed under the CC-BY-NC license, with the addendum that they should not be used to train Generative AI models.
## Citing PUG
If you use one of the PUG datasets, please cite:
```
@misc{bordes2023pug,
title={PUG: Photorealistic and Semantically Controllable Synthetic Data for Representation Learning},
author={Florian Bordes and Shashank Shekhar and Mark Ibrahim and Diane Bouchacourt and Pascal Vincent and Ari S. Morcos},
year={2023},
eprint={2308.03977},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## To learn more about the PUG datasets:
Please visit the [website](https://pug.metademolab.com/) and the [github](https://github.com/facebookresearch/PUG) | facebook/PUG_SPAR | [
"license:cc-by-nc-4.0",
"arxiv:2308.03977",
"region:us"
] | 2024-01-18T20:29:43+00:00 | {"license": "cc-by-nc-4.0", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "world_name", "dtype": "string"}, {"name": "character_name", "dtype": "string"}, {"name": "character2_name", "dtype": "string"}, {"name": "character1_pos", "dtype": "string"}, {"name": "character2_pos", "dtype": "string"}, {"name": "character_texture", "dtype": "string"}, {"name": "character2_texture", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17215863251.4, "num_examples": 43560}], "download_size": 17185543222, "dataset_size": 17215863251.4}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T20:45:23+00:00 | [
"2308.03977"
] | [] | TAGS
#license-cc-by-nc-4.0 #arxiv-2308.03977 #region-us
|
## PUG: SPAR
PUG: SPAR (Scene, Position, Attribute, Relation) contains 43,560 test samples, with image-caption pairs that evaluate VLMs scene and object recognition, as well as inter-object and object-attribute relationships respectively. We utilize scenes containing up to two objects in 4 unique spatial relationships and 4 different texture variations.
## LICENSE
The datasets are distributed under the CC-BY-NC, with the addenda that they should not be used to train Generative AI models.
## Citing PUG
If you use one of the PUG datasets, please cite:
## To learn more about the PUG datasets:
Please visit the website and the github | [
"## PUG: SPAR\nPUG: SPAR (Scene, Position, Attribute, Relation) contains 43,560 test samples, with image-caption pairs that evaluate VLMs scene and object recognition, as well as inter-object and object-attribute relationships respectively. We utilize scenes containing up to two objects in 4 unique spatial relationships and 4 different texture variations.",
"## LICENSE\nThe datasets are distributed under the CC-BY-NC, with the addenda that they should not be used to train Generative AI models.",
"## Citing PUG\nIf you use one of the PUG datasets, please cite:",
"## To learn more about the PUG datasets:\nPlease visit the website and the github"
] | [
"TAGS\n#license-cc-by-nc-4.0 #arxiv-2308.03977 #region-us \n",
"## PUG: SPAR\nPUG: SPAR (Scene, Position, Attribute, Relation) contains 43,560 test samples, with image-caption pairs that evaluate VLMs scene and object recognition, as well as inter-object and object-attribute relationships respectively. We utilize scenes containing up to two objects in 4 unique spatial relationships and 4 different texture variations.",
"## LICENSE\nThe datasets are distributed under the CC-BY-NC, with the addenda that they should not be used to train Generative AI models.",
"## Citing PUG\nIf you use one of the PUG datasets, please cite:",
"## To learn more about the PUG datasets:\nPlease visit the website and the github"
] |
d6e696631dddfbc9425b064582f7d7d7148744cd |
# DATACLYSM PATCH 0.0.4: PUBMED
## USE THE NOTEBOOK TO GET STARTED!
https://github.com/somewheresystems/dataclysm
# somewheresystems/dataclysm-pubmed
This dataset comprises 35.7 million PubMed metadata entries, including titles and, for roughly 69% of entries, abstracts, with two new columns added: title-embeddings and abstract-embeddings. These additional columns were generated using the bge-small-en-v1.5 embedding model. The dataset was sourced from the PubMed Baseline as of December 12, 2023. https://ftp.ncbi.nlm.nih.gov/pubmed/baseline/
# Embeddings Model
We used https://huggingface.co/BAAI/bge-small-en-v1.5 to embed the `title` and `abstract` fields.
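As a rough sketch of how such embeddings can be produced with the same model (here via the `sentence-transformers` package; the exact preprocessing used for the released columns is not documented in this card):

```python
from sentence_transformers import SentenceTransformer

# bge-small-en-v1.5 produces 384-dimensional sentence embeddings
model = SentenceTransformer("BAAI/bge-small-en-v1.5")

titles = ["A randomized trial of early mobilization after stroke"]
abstracts = ["Background: Early mobilization is thought to improve outcomes ..."]

title_embeddings = model.encode(titles, normalize_embeddings=True)
abstract_embeddings = model.encode(abstracts, normalize_embeddings=True)
print(title_embeddings.shape)  # (1, 384)
```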
## Contact
Please contact [email protected] for inquiries. | somewheresystems/dataclysm-pubmed | [
"size_categories:10M<n<100M",
"language:en",
"license:apache-2.0",
"pubmed",
"medical",
"medicine",
"NIH",
"science",
"region:us"
] | 2024-01-18T20:54:56+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10M<n<100M"], "pretty_name": "dataclysm-arxiv", "tags": ["pubmed", "medical", "medicine", "NIH", "science"]} | 2024-02-01T01:46:14+00:00 | [] | [
"en"
] | TAGS
#size_categories-10M<n<100M #language-English #license-apache-2.0 #pubmed #medical #medicine #NIH #science #region-us
|
# DATACLYSM PATCH 0.0.4: PUBMED
## USE THE NOTEBOOK TO GET STARTED!
URL
# somewheresystems/dataclysm-pubmed
This dataset comprises of 35.7 million PubMed metadata entries including title and some (~69% with) abstracts, with two new columns added: title-embeddings and abstract-embeddings. These additional columns were generated using the bge-small-en-v1.5 embeddings model. The dataset was sourced from the PubMed Baseline as of December 12, 2023. URL
# Embeddings Model
We used URL to embed the 'title' and 'abstract' fields.
## Contact
Please contact hi@URL for inquiries. | [
"# DATACLYSM PATCH 0.0.4: PUBMED",
"## USE THE NOTEBOOK TO GET STARTED!\nURL",
"# somewheresystems/dataclysm-pubmed\n\nThis dataset comprises of 35.7 million PubMed metadata entries including title and some (~69% with) abstracts, with two new columns added: title-embeddings and abstract-embeddings. These additional columns were generated using the bge-small-en-v1.5 embeddings model. The dataset was sourced from the PubMed Baseline as of December 12, 2023. URL",
"# Embeddings Model\n\nWe used URL to embed the 'title' and 'abstract' fields.",
"## Contact\n\nPlease contact hi@URL for inquiries."
] | [
"TAGS\n#size_categories-10M<n<100M #language-English #license-apache-2.0 #pubmed #medical #medicine #NIH #science #region-us \n",
"# DATACLYSM PATCH 0.0.4: PUBMED",
"## USE THE NOTEBOOK TO GET STARTED!\nURL",
"# somewheresystems/dataclysm-pubmed\n\nThis dataset comprises of 35.7 million PubMed metadata entries including title and some (~69% with) abstracts, with two new columns added: title-embeddings and abstract-embeddings. These additional columns were generated using the bge-small-en-v1.5 embeddings model. The dataset was sourced from the PubMed Baseline as of December 12, 2023. URL",
"# Embeddings Model\n\nWe used URL to embed the 'title' and 'abstract' fields.",
"## Contact\n\nPlease contact hi@URL for inquiries."
] |
66b0875f1e7a0f4af2a48bc6720d158fe44c2056 |
# 8TAGS
### Dataset Summary
A Polish topic classification dataset consisting of headlines from social media posts. It contains about 50,000 sentences annotated with 8 topic labels: film, history, food, medicine, motorization, work, sport and technology. This dataset was created automatically by extracting sentences from headlines and short descriptions of articles posted on the Polish social networking site **wykop.pl**. The service allows users to annotate articles with one or more tags (categories). The dataset represents a selection of article sentences from 8 popular categories. The resulting corpus contains cleaned, tokenized, unambiguous sentences (tagged with only one of the selected categories) that are longer than 30 characters.
### Data Instances
Example instance:
```
{
"sentence": "Kierowca był nieco zdziwiony że podróżując sporo ponad 200 km / h zatrzymali go policjanci.",
"label": "4"
}
```
### Data Fields
- sentence: sentence text
- label: label identifier corresponding to one of 8 topics
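A minimal sketch of loading the dataset and mapping label ids back to topic names, assuming the label field is exposed as a `ClassLabel` as declared in this card's metadata:

```python
from datasets import load_dataset

ds = load_dataset("djstrong/8tags", split="train")

example = ds[0]
# Map the integer label id back to its topic name (film, history, food, ...)
topic = ds.features["label"].int2str(example["label"])
print(example["sentence"], "->", topic)
```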
### Citation Information
```
@inproceedings{dadas-etal-2020-evaluation,
title = "Evaluation of Sentence Representations in {P}olish",
author = "Dadas, Slawomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.207",
pages = "1674--1680",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
| djstrong/8tags | [
"task_categories:text-classification",
"task_ids:topic-classification",
"task_ids:multi-class-classification",
"multilinguality:monolingual",
"size_categories:10K<n<100K",
"language:pl",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-01-18T21:07:54+00:00 | {"language": ["pl"], "license": ["cc-by-nc-sa-4.0"], "multilinguality": ["monolingual"], "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "task_ids": ["topic-classification", "multi-class-classification"], "pretty_name": "8TAGS", "dataset_info": {"features": [{"name": "sentence", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "film", "1": "history", "2": "food", "3": "medicine", "4": "motorization", "5": "work", "6": "sport", "7": "technology"}}}}], "splits": [{"name": "train", "num_bytes": 3765325, "num_examples": 40001}, {"name": "validation", "num_bytes": 467676, "num_examples": 5000}, {"name": "test", "num_bytes": 416311, "num_examples": 4372}]}} | 2024-01-18T21:18:22+00:00 | [] | [
"pl"
] | TAGS
#task_categories-text-classification #task_ids-topic-classification #task_ids-multi-class-classification #multilinguality-monolingual #size_categories-10K<n<100K #language-Polish #license-cc-by-nc-sa-4.0 #region-us
|
# 8TAGS
### Dataset Summary
A Polish topic classification dataset consisting of headlines from social media posts. It contains about 50,000 sentences annotated with 8 topic labels: film, history, food, medicine, motorization, work, sport and technology. This dataset was created automatically by extracting sentences from headlines and short descriptions of articles posted on Polish social networking site URL. The service allows users to annotate articles with one or more tags (categories). Dataset represents a selection of article sentences from 8 popular categories. The resulting corpus contains cleaned and tokenized, unambiguous sentences (tagged with only one of the selected categories), and longer than 30 characters.
### Data Instances
Example instance:
### Data Fields
- sentence: sentence text
- label: label identifier corresponding to one of 8 topics
| [
"# 8TAGS",
"### Dataset Summary\n\nA Polish topic classification dataset consisting of headlines from social media posts. It contains about 50,000 sentences annotated with 8 topic labels: film, history, food, medicine, motorization, work, sport and technology. This dataset was created automatically by extracting sentences from headlines and short descriptions of articles posted on Polish social networking site URL. The service allows users to annotate articles with one or more tags (categories). Dataset represents a selection of article sentences from 8 popular categories. The resulting corpus contains cleaned and tokenized, unambiguous sentences (tagged with only one of the selected categories), and longer than 30 characters.",
"### Data Instances\n\nExample instance:",
"### Data Fields\n\n- sentence: sentence text\n- label: label identifier corresponding to one of 8 topics"
] | [
"TAGS\n#task_categories-text-classification #task_ids-topic-classification #task_ids-multi-class-classification #multilinguality-monolingual #size_categories-10K<n<100K #language-Polish #license-cc-by-nc-sa-4.0 #region-us \n",
"# 8TAGS",
"### Dataset Summary\n\nA Polish topic classification dataset consisting of headlines from social media posts. It contains about 50,000 sentences annotated with 8 topic labels: film, history, food, medicine, motorization, work, sport and technology. This dataset was created automatically by extracting sentences from headlines and short descriptions of articles posted on Polish social networking site URL. The service allows users to annotate articles with one or more tags (categories). Dataset represents a selection of article sentences from 8 popular categories. The resulting corpus contains cleaned and tokenized, unambiguous sentences (tagged with only one of the selected categories), and longer than 30 characters.",
"### Data Instances\n\nExample instance:",
"### Data Fields\n\n- sentence: sentence text\n- label: label identifier corresponding to one of 8 topics"
] |
74242c7063b987d7e1f281afd306d9f488e093e0 |
# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freecs/ThetaWave-7B-v0](https://huggingface.co/freecs/ThetaWave-7B-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freecs__ThetaWave-7B-v0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T21:58:24.571103](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v0/blob/main/results_2024-01-18T21-58-24.571103.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6139427832716801,
"acc_stderr": 0.0329772926223431,
"acc_norm": 0.6160760841456204,
"acc_norm_stderr": 0.033642251011914065,
"mc1": 0.47123623011015914,
"mc1_stderr": 0.01747451384852552,
"mc2": 0.6156128002968497,
"mc2_stderr": 0.015661943081276854
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.014077223108470142,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6622186815375424,
"acc_stderr": 0.004719870074967247,
"acc_norm": 0.8535152360087632,
"acc_norm_stderr": 0.0035286889976580537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165894,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072388,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072388
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295827,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295827
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162111,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162111
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.033907806129727755,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.033907806129727755
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707779,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.01396439376989912,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.01396439376989912
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624733,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624733
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.01657899743549671,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.01657899743549671
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.02685729466328141,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.02685729466328141
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811032,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811032
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598557,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.527363184079602,
"acc_stderr": 0.03530235517334683,
"acc_norm": 0.527363184079602,
"acc_norm_stderr": 0.03530235517334683
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47123623011015914,
"mc1_stderr": 0.01747451384852552,
"mc2": 0.6156128002968497,
"mc2_stderr": 0.015661943081276854
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.01131779878162692
},
"harness|gsm8k|5": {
"acc": 0.5481425322213799,
"acc_stderr": 0.013708494995677644
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_freecs__ThetaWave-7B-v0 | [
"region:us"
] | 2024-01-18T22:00:43+00:00 | {"pretty_name": "Evaluation run of freecs/ThetaWave-7B-v0", "dataset_summary": "Dataset automatically created during the evaluation run of model [freecs/ThetaWave-7B-v0](https://huggingface.co/freecs/ThetaWave-7B-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__ThetaWave-7B-v0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T21:58:24.571103](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v0/blob/main/results_2024-01-18T21-58-24.571103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6139427832716801,\n \"acc_stderr\": 0.0329772926223431,\n \"acc_norm\": 0.6160760841456204,\n \"acc_norm_stderr\": 0.033642251011914065,\n \"mc1\": 0.47123623011015914,\n \"mc1_stderr\": 0.01747451384852552,\n \"mc2\": 0.6156128002968497,\n \"mc2_stderr\": 0.015661943081276854\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.014077223108470142,\n \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6622186815375424,\n \"acc_stderr\": 0.004719870074967247,\n \"acc_norm\": 0.8535152360087632,\n \"acc_norm_stderr\": 0.0035286889976580537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165894,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072388,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072388\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647078,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647078\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295827,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295827\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162111,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162111\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.033907806129727755,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.033907806129727755\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.02250903393707779,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.02250903393707779\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.01396439376989912,\n 
\"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.01396439376989912\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624733,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624733\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n \"acc_stderr\": 0.01657899743549671,\n \"acc_norm\": 0.4346368715083799,\n \"acc_norm_stderr\": 0.01657899743549671\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.02685729466328141,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.02685729466328141\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598557,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598557\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.527363184079602,\n \"acc_stderr\": 0.03530235517334683,\n \"acc_norm\": 0.527363184079602,\n \"acc_norm_stderr\": 0.03530235517334683\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47123623011015914,\n \"mc1_stderr\": 0.01747451384852552,\n \"mc2\": 0.6156128002968497,\n \"mc2_stderr\": 0.015661943081276854\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.01131779878162692\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5481425322213799,\n \"acc_stderr\": 0.013708494995677644\n }\n}\n```", "repo_url": 
"https://huggingface.co/freecs/ThetaWave-7B-v0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|arc:challenge|25_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|gsm8k|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hellaswag|10_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T21-58-24.571103.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T21-58-24.571103.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T21-58-24.571103.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T21-58-24.571103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T21-58-24.571103.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T21_58_24.571103", "path": ["**/details_harness|winogrande|5_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T21-58-24.571103.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T21_58_24.571103", "path": ["results_2024-01-18T21-58-24.571103.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T21-58-24.571103.parquet"]}]}]} | 2024-01-18T22:01:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v0
Dataset automatically created during the evaluation run of model freecs/ThetaWave-7B-v0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
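The configuration name below (`harness_winogrande_5`) is one of the 63 configurations declared in this card's metadata; any other configuration name can be substituted.

```python
from datasets import load_dataset

# Load the per-example details for one task configuration of this evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-7B-v0",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
print(data)
```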
## Latest results
These are the latest results from run 2024-01-18T21:58:24.571103 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
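The aggregated metrics can also be pulled programmatically from the "results" configuration declared in this card's metadata. A minimal sketch; the exact column layout of the results file is not documented here:

```python
from datasets import load_dataset

# The "results" configuration aggregates all task scores for this run;
# the "latest" split points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_freecs__ThetaWave-7B-v0",
    "results",
    split="latest",
)
print(results)
print(results.column_names)
```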
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v0\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-7B-v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T21:58:24.571103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of freecs/ThetaWave-7B-v0\n\n\n\nDataset automatically created during the evaluation run of model freecs/ThetaWave-7B-v0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T21:58:24.571103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
fb42dd891b4cd4f6a4fa0d6516965ed4881fc059 |
# PPC - Polish Paraphrase Corpus
### Dataset Summary
Polish Paraphrase Corpus contains 7000 manually labeled sentence pairs. The dataset was divided into training, validation and test splits. The training part includes 5000 examples, while the other parts contain 1000 examples each. The main purpose of creating such a dataset was to verify how machine learning models perform in the challenging problem of paraphrase identification, where most records contain semantically overlapping parts. Technically, this is a three-class classification task, where each record can be assigned to one of the following categories:
- Exact paraphrases - Sentence pairs that convey exactly the same information. We are interested only in the semantic meaning of the sentence, therefore this category also includes sentences that are semantically identical but, for example, have different emotional emphasis.
- Close paraphrases - Sentence pairs with similar semantic meaning. In this category we include all pairs which contain the same information, but in addition to it there may be other semantically non-overlapping parts. This category also contains context-dependent paraphrases - sentence pairs that may have the same meaning in some contexts but are different in others.
- Non-paraphrases - All other cases, including contradictory sentences and semantically unrelated sentences.
The corpus contains 2911, 1297, and 2792 examples for the above three categories, respectively. The process of annotating the dataset was preceded by an automated generation of candidate pairs, which were then manually labeled. We experimented with two popular techniques of generating possible paraphrases: backtranslation with a set of neural machine translation models and paraphrase mining using a pre-trained multilingual sentence encoder. The extracted sentence pairs are drawn from different data sources: Tatoeba, Polish news articles, Wikipedia, and the Polish version of the SICK dataset. Since most of the sentence pairs obtained in this way fell into the first two categories, in order to balance the dataset, some of the examples were manually modified to convey different information. In this way, even negative examples often have high semantic overlap, making this problem difficult for machine learning models.
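To illustrate the paraphrase-mining step described above, the snippet below is a minimal sketch using the `sentence-transformers` library. The encoder name, example sentences, and score threshold are illustrative assumptions; the card does not state which encoder or settings were used to build PPC.

```python
# Minimal sketch of mining candidate paraphrase pairs with a multilingual
# sentence encoder. Model name and threshold are illustrative assumptions,
# not the exact settings used for PPC.
from sentence_transformers import SentenceTransformer, util

sentences = [
    "Lotnisko w Trypolisie zostało ostrzelane rakietami.",
    "Jedyne lotnisko w stolicy Libii ostrzelano rakietami.",
    "W Warszawie otwarto nowe muzeum sztuki.",
]

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# paraphrase_mining returns (cosine_score, i, j) triples sorted by score.
for score, i, j in util.paraphrase_mining(model, sentences):
    if score >= 0.7:  # illustrative cut-off for candidate pairs
        print(f"{score:.2f} | {sentences[i]} | {sentences[j]}")
```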
### Data Instances
Example instance:
```
{
"sentence_A": "Libia: lotnisko w w Trypolisie ostrzelane rakietami.",
"sentence_B": "Jedyne lotnisko w stolicy Libii - Trypolisie zostało w nocy z wtorku na środę ostrzelane rakietami.",
"label": "2"
}
```
### Data Fields
- sentence_A: first sentence text
- sentence_B: second sentence text
- label: label identifier corresponding to one of three categories
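The splits and fields above can be loaded with the `datasets` library; the snippet below is a minimal sketch assuming the repository id `djstrong/ppc` under which this card is published.

```python
from datasets import load_dataset

# Load all three splits of the Polish Paraphrase Corpus.
ppc = load_dataset("djstrong/ppc")
print(ppc)  # train: 5000, validation: 1000, test: 1000 examples

# Inspect a single record: two sentences and their class label.
example = ppc["train"][0]
print(example["sentence_A"], example["sentence_B"], example["label"])
```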
### Citation Information
```
@inproceedings{9945218,
author={Dadas, S{\l}awomir},
booktitle={2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC)},
title={Training Effective Neural Sentence Encoders from Automatically Mined Paraphrases},
year={2022},
volume={},
number={},
pages={371-378},
doi={10.1109/SMC53654.2022.9945218}
}
``` | djstrong/ppc | [
"task_categories:text-classification",
"task_ids:semantic-similarity-classification",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"language:pl",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-01-18T22:06:56+00:00 | {"language": ["pl"], "license": ["cc-by-nc-sa-4.0"], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"], "task_ids": ["semantic-similarity-classification"], "pretty_name": "Polish Paraphrase Corpus", "dataset_info": {"features": [{"name": "sentence_A", "dtype": "string"}, {"name": "sentence_B", "dtype": "string"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "not used", "1": "exact paraphrases", "2": "similar sentences", "3": "non-paraphrases"}}}}], "splits": [{"name": "train", "num_bytes": 539121, "num_examples": 5000}, {"name": "validation", "num_bytes": 107010, "num_examples": 1000}, {"name": "test", "num_bytes": 106515, "num_examples": 1000}]}} | 2024-01-18T22:07:42+00:00 | [] | [
"pl"
] | TAGS
#task_categories-text-classification #task_ids-semantic-similarity-classification #multilinguality-monolingual #size_categories-1K<n<10K #language-Polish #license-cc-by-nc-sa-4.0 #region-us
|
# PPC - Polish Paraphrase Corpus
### Dataset Summary
Polish Paraphrase Corpus contains 7000 manually labeled sentence pairs. The dataset was divided into training, validation and test splits. The training part includes 5000 examples, while the other parts contain 1000 examples each. The main purpose of creating such a dataset was to verify how machine learning models perform in the challenging problem of paraphrase identification, where most records contain semantically overlapping parts. Technically, this is a three-class classification task, where each record can be assigned to one of the following categories:
- Exact paraphrases - Sentence pairs that convey exactly the same information. We are interested only in the semantic meaning of the sentence, therefore this category also includes sentences that are semantically identical but, for example, have different emotional emphasis.
- Close paraphrases - Sentence pairs with similar semantic meaning. In this category we include all pairs which contain the same information, but in addition to it there may be other semantically non-overlapping parts. This category also contains context-dependent paraphrases - sentence pairs that may have the same meaning in some contexts but are different in others.
- Non-paraphrases - All other cases, including contradictory sentences and semantically unrelated sentences.
The corpus contains 2911, 1297, and 2792 examples for the above three categories, respectively. The process of annotating the dataset was preceded by an automated generation of candidate pairs, which were then manually labeled. We experimented with two popular techniques of generating possible paraphrases: backtranslation with a set of neural machine translation models and paraphrase mining using a pre-trained multilingual sentence encoder. The extracted sentence pairs are drawn from different data sources: Tatoeba, Polish news articles, Wikipedia, and the Polish version of the SICK dataset. Since most of the sentence pairs obtained in this way fell into the first two categories, in order to balance the dataset, some of the examples were manually modified to convey different information. In this way, even negative examples often have high semantic overlap, making this problem difficult for machine learning models.
### Data Instances
Example instance:
### Data Fields
- sentence_A: first sentence text
- sentence_B: second sentence text
- label: label identifier corresponding to one of three categories
| [
"# PPC - Polish Paraphrase Corpus",
"### Dataset Summary\n\nPolish Paraphrase Corpus contains 7000 manually labeled sentence pairs. The dataset was divided into training, validation and test splits. The training part includes 5000 examples, while the other parts contain 1000 examples each. The main purpose of creating such a dataset was to verify how machine learning models perform in the challenging problem of paraphrase identification, where most records contain semantically overlapping parts. Technically, this is a three-class classification task, where each record can be assigned to one of the following categories:\n- Exact paraphrases - Sentence pairs that convey exactly the same information. We are interested only in the semantic meaning of the sentence, therefore this category also includes sentences that are semantically identical but, for example, have different emotional emphasis.\n- Close paraphrases - Sentence pairs with similar semantic meaning. In this category we include all pairs which contain the same information, but in addition to it there may be other semantically non-overlapping parts. This category also contains context-dependent paraphrases - sentence pairs that may have the same meaning in some contexts but are different in others.\n- Non-paraphrases - All other cases, including contradictory sentences and semantically unrelated sentences.\n\nThe corpus contains 2911, 1297, and 2792 examples for the above three categories, respectively. The process of annotating the dataset was preceded by an automated generation of candidate pairs, which were then manually labeled. We experimented with two popular techniques of generating possible paraphrases: backtranslation with a set of neural machine translation models and paraphrase mining using a pre-trained multilingual sentence encoder. The extracted sentence pairs are drawn from different data sources: Taboeba, Polish news articles, Wikipedia and Polish version of SICK dataset. Since most of the sentence pairs obtained in this way fell into the first two categories, in order to balance the dataset, some of the examples were manually modified to convey different information. In this way, even negative examples often have high semantic overlap, making this problem difficult for machine learning models.",
"### Data Instances\n\nExample instance:",
"### Data Fields\n\n- sentence_A: first sentence text\n- sentence_B: second sentence text\n- label: label identifier corresponding to one of three categories"
] | [
"TAGS\n#task_categories-text-classification #task_ids-semantic-similarity-classification #multilinguality-monolingual #size_categories-1K<n<10K #language-Polish #license-cc-by-nc-sa-4.0 #region-us \n",
"# PPC - Polish Paraphrase Corpus",
"### Dataset Summary\n\nPolish Paraphrase Corpus contains 7000 manually labeled sentence pairs. The dataset was divided into training, validation and test splits. The training part includes 5000 examples, while the other parts contain 1000 examples each. The main purpose of creating such a dataset was to verify how machine learning models perform in the challenging problem of paraphrase identification, where most records contain semantically overlapping parts. Technically, this is a three-class classification task, where each record can be assigned to one of the following categories:\n- Exact paraphrases - Sentence pairs that convey exactly the same information. We are interested only in the semantic meaning of the sentence, therefore this category also includes sentences that are semantically identical but, for example, have different emotional emphasis.\n- Close paraphrases - Sentence pairs with similar semantic meaning. In this category we include all pairs which contain the same information, but in addition to it there may be other semantically non-overlapping parts. This category also contains context-dependent paraphrases - sentence pairs that may have the same meaning in some contexts but are different in others.\n- Non-paraphrases - All other cases, including contradictory sentences and semantically unrelated sentences.\n\nThe corpus contains 2911, 1297, and 2792 examples for the above three categories, respectively. The process of annotating the dataset was preceded by an automated generation of candidate pairs, which were then manually labeled. We experimented with two popular techniques of generating possible paraphrases: backtranslation with a set of neural machine translation models and paraphrase mining using a pre-trained multilingual sentence encoder. The extracted sentence pairs are drawn from different data sources: Taboeba, Polish news articles, Wikipedia and Polish version of SICK dataset. Since most of the sentence pairs obtained in this way fell into the first two categories, in order to balance the dataset, some of the examples were manually modified to convey different information. In this way, even negative examples often have high semantic overlap, making this problem difficult for machine learning models.",
"### Data Instances\n\nExample instance:",
"### Data Fields\n\n- sentence_A: first sentence text\n- sentence_B: second sentence text\n- label: label identifier corresponding to one of three categories"
] |
6cdb3e0db9c63e357d31051fafe5d1a35f603166 |
# Dataset Card for Evaluation run of TeeZee/Kyllene-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/Kyllene-v1.0](https://huggingface.co/TeeZee/Kyllene-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__Kyllene-v1.0",
"harness_winogrande_5",
split="train")
```
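The aggregated "results" configuration mentioned above can be read in the same way. This is a minimal sketch; the config and split names follow the description in this card and should be checked against the repository before use.

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run; per the card, the "train" split
# always points to the latest results.
results = load_dataset(
    "open-llm-leaderboard/details_TeeZee__Kyllene-v1.0",
    "results",
    split="train",
)
print(results[0])
```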
## Latest results
These are the [latest results from run 2024-01-18T22:11:48.814453](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Kyllene-v1.0/blob/main/results_2024-01-18T22-11-48.814453.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7242799086166349,
"acc_stderr": 0.029600540759891002,
"acc_norm": 0.7337880683043764,
"acc_norm_stderr": 0.030171387305390412,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.578864011358949,
"mc2_stderr": 0.01619751826500765
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.01409781067804219,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600938
},
"harness|hellaswag|10": {
"acc": 0.6635132443736308,
"acc_stderr": 0.004715419139697522,
"acc_norm": 0.8450507866958773,
"acc_norm_stderr": 0.0036111673029597764
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7489361702127659,
"acc_stderr": 0.02834696377716246,
"acc_norm": 0.7489361702127659,
"acc_norm_stderr": 0.02834696377716246
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6402116402116402,
"acc_stderr": 0.024718075944129284,
"acc_norm": 0.6402116402116402,
"acc_norm_stderr": 0.024718075944129284
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6108374384236454,
"acc_stderr": 0.0343046241610387,
"acc_norm": 0.6108374384236454,
"acc_norm_stderr": 0.0343046241610387
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.02193804773885311,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.02193804773885311
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909044,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909044
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7871794871794872,
"acc_stderr": 0.020752423722128002,
"acc_norm": 0.7871794871794872,
"acc_norm_stderr": 0.020752423722128002
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.029958249250082118,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.029958249250082118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.024762902678057926,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.024762902678057926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862093,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862093
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.02219857103945679,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.02219857103945679
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746793,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746793
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094713,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.03680918141673881,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.03680918141673881
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371047,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371047
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625856,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625856
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.01123826083164834,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.01123826083164834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6547486033519553,
"acc_stderr": 0.01590143260893036,
"acc_norm": 0.6547486033519553,
"acc_norm_stderr": 0.01590143260893036
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02229285828456807,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02229285828456807
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.021330868762127055,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.021330868762127055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6134751773049646,
"acc_stderr": 0.029049190342543465,
"acc_norm": 0.6134751773049646,
"acc_norm_stderr": 0.029049190342543465
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.559973924380704,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.559973924380704,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.02456220431414231,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.02456220431414231
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.016500472979024808,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.016500472979024808
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.578864011358949,
"mc2_stderr": 0.01619751826500765
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939326
},
"harness|gsm8k|5": {
"acc": 0.30401819560272936,
"acc_stderr": 0.012670420440198654
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TeeZee__Kyllene-v1.0 | [
"region:us"
] | 2024-01-18T22:14:00+00:00 | {"pretty_name": "Evaluation run of TeeZee/Kyllene-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [TeeZee/Kyllene-v1.0](https://huggingface.co/TeeZee/Kyllene-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__Kyllene-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T22:11:48.814453](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__Kyllene-v1.0/blob/main/results_2024-01-18T22-11-48.814453.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7242799086166349,\n \"acc_stderr\": 0.029600540759891002,\n \"acc_norm\": 0.7337880683043764,\n \"acc_norm_stderr\": 0.030171387305390412,\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.578864011358949,\n \"mc2_stderr\": 0.01619751826500765\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.01409781067804219,\n \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600938\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6635132443736308,\n \"acc_stderr\": 0.004715419139697522,\n \"acc_norm\": 0.8450507866958773,\n \"acc_norm_stderr\": 0.0036111673029597764\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 
0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7489361702127659,\n \"acc_stderr\": 0.02834696377716246,\n \"acc_norm\": 0.7489361702127659,\n \"acc_norm_stderr\": 0.02834696377716246\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6402116402116402,\n \"acc_stderr\": 0.024718075944129284,\n \"acc_norm\": 0.6402116402116402,\n \"acc_norm_stderr\": 0.024718075944129284\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6108374384236454,\n \"acc_stderr\": 0.0343046241610387,\n \"acc_norm\": 0.6108374384236454,\n \"acc_norm_stderr\": 0.0343046241610387\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.02193804773885311,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.02193804773885311\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909044,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909044\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7871794871794872,\n \"acc_stderr\": 
0.020752423722128002,\n \"acc_norm\": 0.7871794871794872,\n \"acc_norm_stderr\": 0.020752423722128002\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.029958249250082118,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.029958249250082118\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057926,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862093,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862093\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.03256850570293647,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.03256850570293647\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746793,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746793\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.028380391147094713,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.028380391147094713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.03680918141673881,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.03680918141673881\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371047,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371047\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.018315891685625856,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.018315891685625856\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.01123826083164834,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 
0.01123826083164834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6547486033519553,\n \"acc_stderr\": 0.01590143260893036,\n \"acc_norm\": 0.6547486033519553,\n \"acc_norm_stderr\": 0.01590143260893036\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02229285828456807,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02229285828456807\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.021330868762127055,\n \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.021330868762127055\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6134751773049646,\n \"acc_stderr\": 0.029049190342543465,\n \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.029049190342543465\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.559973924380704,\n \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.559973924380704,\n \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02456220431414231,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02456220431414231\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.016500472979024808,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.016500472979024808\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.578864011358949,\n \"mc2_stderr\": 0.01619751826500765\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939326\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30401819560272936,\n \"acc_stderr\": 0.012670420440198654\n }\n}\n```", "repo_url": "https://huggingface.co/TeeZee/Kyllene-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|arc:challenge|25_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|gsm8k|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hellaswag|10_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-11-48.814453.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-11-48.814453.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-11-48.814453.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T22-11-48.814453.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-11-48.814453.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["**/details_harness|winogrande|5_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T22-11-48.814453.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T22_11_48.814453", "path": ["results_2024-01-18T22-11-48.814453.parquet"]}, {"split": "latest", "path": 
["results_2024-01-18T22-11-48.814453.parquet"]}]}]} | 2024-01-18T22:14:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of TeeZee/Kyllene-v1.0
Dataset automatically created during the evaluation run of model TeeZee/Kyllene-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
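A minimal sketch, assuming this repo follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other leaderboard detail datasets (here `details_TeeZee__Kyllene-v1.0`) and picking one of the 63 task configurations as an example:
```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the task configurations; per the card,
# the "train" split always points to the latest results for that task.
data = load_dataset(
    "open-llm-leaderboard/details_TeeZee__Kyllene-v1.0",
    "harness_winogrande_5",
    split="train",
)
```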
## Latest results
These are the latest results from run 2024-01-18T22:11:48.814453 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of TeeZee/Kyllene-v1.0\n\n\n\nDataset automatically created during the evaluation run of model TeeZee/Kyllene-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T22:11:48.814453(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TeeZee/Kyllene-v1.0\n\n\n\nDataset automatically created during the evaluation run of model TeeZee/Kyllene-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T22:11:48.814453(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ed0413a40e113bf522f261bdda7d5395354ee090 |
# Dataset Card for Evaluation run of ValiantLabs/Fireplace-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ValiantLabs/Fireplace-13b](https://huggingface.co/ValiantLabs/Fireplace-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ValiantLabs__Fireplace-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T22:29:29.742832](https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__Fireplace-13b/blob/main/results_2024-01-18T22-29-29.742832.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4379234507249955,
"acc_stderr": 0.034366003843291505,
"acc_norm": 0.44072254848921605,
"acc_norm_stderr": 0.03509834671600673,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.01630598864892061,
"mc2": 0.48240026363735705,
"mc2_stderr": 0.015361662513373581
},
"harness|arc:challenge|25": {
"acc": 0.4325938566552901,
"acc_stderr": 0.014478005694182528,
"acc_norm": 0.47696245733788395,
"acc_norm_stderr": 0.01459587320535827
},
"harness|hellaswag|10": {
"acc": 0.5217088229436367,
"acc_stderr": 0.004985076094464753,
"acc_norm": 0.6960764787890859,
"acc_norm_stderr": 0.0045901000501988335
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416908,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416908
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432564,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45161290322580644,
"acc_stderr": 0.028310500348568385,
"acc_norm": 0.45161290322580644,
"acc_norm_stderr": 0.028310500348568385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427524,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427524
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5393939393939394,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.5393939393939394,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.03526552724601199,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.03526552724601199
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49740932642487046,
"acc_stderr": 0.03608390745384486,
"acc_norm": 0.49740932642487046,
"acc_norm_stderr": 0.03608390745384486
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602364,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602364
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5394495412844037,
"acc_stderr": 0.02137049460999509,
"acc_norm": 0.5394495412844037,
"acc_norm_stderr": 0.02137049460999509
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.04948637324026637,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.04948637324026637
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749448,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749448
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.561941251596424,
"acc_stderr": 0.017742232238257244,
"acc_norm": 0.561941251596424,
"acc_norm_stderr": 0.017742232238257244
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.02688264343402289,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.02688264343402289
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527813,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527813
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.02839677044411129,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.02839677044411129
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4845679012345679,
"acc_stderr": 0.0278074900442762,
"acc_norm": 0.4845679012345679,
"acc_norm_stderr": 0.0278074900442762
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963768,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963768
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32790091264667537,
"acc_stderr": 0.011989936640666535,
"acc_norm": 0.32790091264667537,
"acc_norm_stderr": 0.011989936640666535
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3872549019607843,
"acc_stderr": 0.01970687580408562,
"acc_norm": 0.3872549019607843,
"acc_norm_stderr": 0.01970687580408562
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.035333892347392454,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.035333892347392454
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.01630598864892061,
"mc2": 0.48240026363735705,
"mc2_stderr": 0.015361662513373581
},
"harness|winogrande|5": {
"acc": 0.6716653512233622,
"acc_stderr": 0.01319829944971789
},
"harness|gsm8k|5": {
"acc": 0.2577710386656558,
"acc_stderr": 0.012048370213576602
}
}
```
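To pull these aggregated numbers programmatically rather than reading the JSON above, a minimal sketch, assuming this repo exposes the same `results` configuration and `latest` split as the other leaderboard detail datasets in this dump:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; the "latest"
# split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ValiantLabs__Fireplace-13b",
    "results",
    split="latest",
)
print(results[0])  # print the first row of aggregated results
```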
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ValiantLabs__Fireplace-13b | [
"region:us"
] | 2024-01-18T22:31:51+00:00 | {"pretty_name": "Evaluation run of ValiantLabs/Fireplace-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ValiantLabs/Fireplace-13b](https://huggingface.co/ValiantLabs/Fireplace-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ValiantLabs__Fireplace-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T22:29:29.742832](https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__Fireplace-13b/blob/main/results_2024-01-18T22-29-29.742832.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4379234507249955,\n \"acc_stderr\": 0.034366003843291505,\n \"acc_norm\": 0.44072254848921605,\n \"acc_norm_stderr\": 0.03509834671600673,\n \"mc1\": 0.3182374541003672,\n \"mc1_stderr\": 0.01630598864892061,\n \"mc2\": 0.48240026363735705,\n \"mc2_stderr\": 0.015361662513373581\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4325938566552901,\n \"acc_stderr\": 0.014478005694182528,\n \"acc_norm\": 0.47696245733788395,\n \"acc_norm_stderr\": 0.01459587320535827\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5217088229436367,\n \"acc_stderr\": 0.004985076094464753,\n \"acc_norm\": 0.6960764787890859,\n \"acc_norm_stderr\": 0.0045901000501988335\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.04033565667848319,\n \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.04033565667848319\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.03036505082911521,\n \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.03036505082911521\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416908,\n \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416908\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432564,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432564\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45161290322580644,\n \"acc_stderr\": 0.028310500348568385,\n \"acc_norm\": 0.45161290322580644,\n \"acc_norm_stderr\": 0.028310500348568385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427524,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427524\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5393939393939394,\n \"acc_stderr\": 0.03892207016552012,\n \"acc_norm\": 0.5393939393939394,\n \"acc_norm_stderr\": 0.03892207016552012\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5707070707070707,\n \"acc_stderr\": 0.03526552724601199,\n \"acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.03526552724601199\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.49740932642487046,\n \"acc_stderr\": 0.03608390745384486,\n \"acc_norm\": 0.49740932642487046,\n \"acc_norm_stderr\": 0.03608390745384486\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602364,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602364\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.031499305777849054,\n \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.031499305777849054\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5394495412844037,\n \"acc_stderr\": 0.02137049460999509,\n \"acc_norm\": 0.5394495412844037,\n \"acc_norm_stderr\": 0.02137049460999509\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3194444444444444,\n \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.0317987634217685\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5637254901960784,\n \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.04948637324026637,\n \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.04948637324026637\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n \"acc_stderr\": 0.028911208802749448,\n \"acc_norm\": 0.7350427350427351,\n \"acc_norm_stderr\": 0.028911208802749448\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.561941251596424,\n \"acc_stderr\": 0.017742232238257244,\n \"acc_norm\": 0.561941251596424,\n \"acc_norm_stderr\": 0.017742232238257244\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.02688264343402289,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.02688264343402289\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n \"acc_stderr\": 0.014635185616527813,\n \"acc_norm\": 0.2581005586592179,\n \"acc_norm_stderr\": 0.014635185616527813\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664274,\n \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664274\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n \"acc_stderr\": 0.02839677044411129,\n \"acc_norm\": 0.49517684887459806,\n \"acc_norm_stderr\": 0.02839677044411129\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963768,\n \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963768\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32790091264667537,\n \"acc_stderr\": 0.011989936640666535,\n \"acc_norm\": 0.32790091264667537,\n \"acc_norm_stderr\": 0.011989936640666535\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.02757646862274053,\n \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.02757646862274053\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3872549019607843,\n \"acc_stderr\": 0.01970687580408562,\n \"acc_norm\": 0.3872549019607843,\n \"acc_norm_stderr\": 0.01970687580408562\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n \"acc_stderr\": 0.035333892347392454,\n \"acc_norm\": 0.5174129353233831,\n \"acc_norm_stderr\": 0.035333892347392454\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.0375363895576169,\n \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.0375363895576169\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n \"mc1_stderr\": 0.01630598864892061,\n \"mc2\": 0.48240026363735705,\n \"mc2_stderr\": 0.015361662513373581\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6716653512233622,\n \"acc_stderr\": 0.01319829944971789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2577710386656558,\n \"acc_stderr\": 0.012048370213576602\n 
}\n}\n```", "repo_url": "https://huggingface.co/ValiantLabs/Fireplace-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|arc:challenge|25_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|gsm8k|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hellaswag|10_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-29-29.742832.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-29-29.742832.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-29-29.742832.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T22-29-29.742832.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-29-29.742832.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T22_29_29.742832", "path": ["**/details_harness|winogrande|5_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T22-29-29.742832.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_18T22_29_29.742832", "path": ["results_2024-01-18T22-29-29.742832.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T22-29-29.742832.parquet"]}]}]} | 2024-01-18T22:32:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ValiantLabs/Fireplace-13b
Dataset automatically created during the evaluation run of model ValiantLabs/Fireplace-13b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
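A minimal loading sketch is shown below; the repository id is assumed from the leaderboard's usual `details_<org>__<model>` naming convention, and any configuration listed in this card's metadata (e.g. `harness_winogrande_5`) can be passed as the second argument:

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention "details_<org>__<model>".
data = load_dataset("open-llm-leaderboard/details_ValiantLabs__Fireplace-13b",
	"harness_winogrande_5",
	split="train")
```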
## Latest results
These are the latest results from run 2024-01-18T22:29:29.742832 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ValiantLabs/Fireplace-13b\n\n\n\nDataset automatically created during the evaluation run of model ValiantLabs/Fireplace-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T22:29:29.742832(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ValiantLabs/Fireplace-13b\n\n\n\nDataset automatically created during the evaluation run of model ValiantLabs/Fireplace-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T22:29:29.742832(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
fac0bd5277e21df5c9bf73c1acc817a397a2e83b | # Dataset Card for "cybersec_embedding_llama_chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Yemmy1000/cybersec_embedding_llama_chat | [
"region:us"
] | 2024-01-18T23:21:30+00:00 | {"dataset_info": {"features": [{"name": "INSTRUCTION", "dtype": "string"}, {"name": "RESPONSE", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5951997, "num_examples": 7697}], "download_size": 2761782, "dataset_size": 5951997}} | 2024-01-18T23:21:32+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cybersec_embedding_llama_chat"
More Information needed | [
"# Dataset Card for \"cybersec_embedding_llama_chat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cybersec_embedding_llama_chat\"\n\nMore Information needed"
] |
0536018fe71cf604889fed7bc990412adce7b602 |
# NuScenes-QA-mini Dataset
## TL;DR:
This dataset is used for multimodal question-answering tasks in autonomous driving scenarios. We created this dataset based on the [nuScenes-QA dataset](https://github.com/qiantianwen/NuScenes-QA) for evaluation in our paper [Modality Plug-and-Play: Elastic Modality Adaptation in Multimodal LLMs for Embodied AI](https://arxiv.org/abs/2312.07886). The samples are divided into day and night scenes.
|scene|# train samples|# validation samples|
|---|---|---|
|day|2,229|2,229|
|night|659|659|
|Each sample contains|
|---|
|original token id in nuscenes database|
|RGB images from 6 views (front, front left, front right, back, back left, back right)|
|5D LiDAR point cloud (distance, intensity, X, Y, and Z axes)|
|question-answer pairs|
## Detailed Description
This dataset is built on the [nuScenes](https://www.nuscenes.org/) mini-split, where we obtain the QA pairs from the original [nuScenes-QA dataset](https://github.com/qiantianwen/NuScenes-QA). The data in the nuScenes-QA dataset is collected from driving scenes in the cities of Boston and Singapore, with diverse locations, times, and weather conditions.
<img src="nuqa_example.PNG" alt="fig1" width="600"/>
Each data sample contains **6-view RGB camera captures, a 5D LiDAR point cloud, and a corresponding text QA pair**. Each LiDAR point cloud includes 5 dimensions of data per point: distance, intensity, and the X, Y, and Z axes. In this dataset, the questions are generally difficult and may require multiple hops of reasoning over the RGB and LiDAR data. For example, to answer the sample question in the above figure, the ML model needs to first identify in which direction the “construction vehicle” appears, and then count the number of “parked trucks” in that direction. In our evaluations, we further cast the question-answering (QA) as an open-ended text generation task. This is more challenging than the evaluation setup in the original nuScenes-QA [paper](https://arxiv.org/abs/2305.14836), where an answer set is predefined and the QA task is a classification task over this predefined answer set.
<img src="image_darken.PNG" alt="fig2" width="600"/>
In most RGB images in the nuScenes dataset, as shown in the above figure (left), night scenes still have abundant lighting (e.g., from street lights), so we further reduce the brightness of the RGB captures in night scenes by 80% and apply a Gaussian blur with a radius of 7, as shown in the above figure (right). Applying this preprocessing to the RGB views in night scenes yields night-scene training and validation splits of 659 samples each. The RGB views in daytime scenes remain unchanged; the day split contains 2,229 samples each for training and validation.
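The darken-and-blur step described above can be approximated with standard PIL operations; the snippet below is only an illustrative sketch of that preprocessing, not the exact code used to build the dataset:

```py
from PIL import Image, ImageEnhance, ImageFilter

def darken_and_blur(image: Image.Image) -> Image.Image:
    # Reduce brightness by 80% (i.e., keep 20% of the original brightness) ...
    darkened = ImageEnhance.Brightness(image).enhance(0.2)
    # ... then apply a Gaussian blur with radius 7.
    return darkened.filter(ImageFilter.GaussianBlur(radius=7))
```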
## How to Use
```py
from datasets import load_dataset
# load train split in day scene
day_train = load_dataset("KevinNotSmile/nuscenes-qa-mini", "day", split="train")
```
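Each split behaves like a regular `datasets` split. A quick way to inspect a sample is sketched below; the feature names in the comments are hypothetical placeholders — check `day_train.column_names` for the actual keys covering the six camera views, the LiDAR point cloud, and the QA pair:

```py
sample = day_train[0]
print(day_train.column_names)  # prints the actual feature names
# Hypothetical keys, for illustration only:
#   sample["question"], sample["answer"]   -> the text QA pair
#   sample["CAM_FRONT"], ...               -> one of the six RGB camera views
#   sample["lidar"]                        -> the 5D LiDAR point cloud
```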
## Citation
If you find our dataset useful, please consider citing
```
@inproceedings{caesar2020nuscenes,
title={nuscenes: A multimodal dataset for autonomous driving},
author={Caesar, Holger and Bankiti, Varun and Lang, Alex H and Vora, Sourabh and Liong, Venice Erin and Xu, Qiang and Krishnan, Anush and Pan, Yu and Baldan, Giancarlo and Beijbom, Oscar},
booktitle={Proceedings of the IEEE/CVF conference on computer vision and pattern recognition},
pages={11621--11631},
year={2020}
}
@article{qian2023nuscenes,
title={NuScenes-QA: A Multi-modal Visual Question Answering Benchmark for Autonomous Driving Scenario},
author={Qian, Tianwen and Chen, Jingjing and Zhuo, Linhai and Jiao, Yang and Jiang, Yu-Gang},
journal={arXiv preprint arXiv:2305.14836},
year={2023}
}
@article{huang2023modality,
title={Modality Plug-and-Play: Elastic Modality Adaptation in Multimodal LLMs for Embodied AI},
author={Huang, Kai and Yang, Boyuan and Gao, Wei},
journal={arXiv preprint arXiv:2312.07886},
year={2023}
}
```
License
===================================================================================================
[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]
In alignment with the original nuScenes license, this work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
| KevinNotSmile/nuscenes-qa-mini | [
"task_categories:visual-question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nc-sa-4.0",
"arxiv:2312.07886",
"arxiv:2305.14836",
"region:us"
] | 2024-01-18T23:31:23+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["visual-question-answering", "text-generation"], "configs": [{"config_name": "day", "data_files": [{"split": "train", "path": "day-train/*"}, {"split": "validation", "path": "day-validation/*"}]}, {"config_name": "night", "data_files": [{"split": "train", "path": "night-train/*"}, {"split": "validation", "path": "night-validation/*"}]}]} | 2024-01-19T03:02:03+00:00 | [
"2312.07886",
"2305.14836"
] | [
"en"
] | TAGS
#task_categories-visual-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #arxiv-2312.07886 #arxiv-2305.14836 #region-us
| NuScenes-QA-mini Dataset
========================
TL;DR:
------
This dataset is used for multimodal question-answering tasks in autonomous driving scenarios. We created this dataset based on nuScenes-QA dataset for evaluation in our paper Modality Plug-and-Play: Elastic Modality Adaptation in Multimodal LLMs for Embodied AI. The samples are divided into day and night scenes.
scene: day, # train samples: 2,229, # validation samples: 2,229
scene: night, # train samples: 659, # validation samples: 659
Detailed Description
--------------------
This dataset is built on the nuScenes mini-split, where we obtain the QA pairs from the original nuScenes-QA dataset. The data in the nuScenes-QA dataset is collected from driving scenes in cities of Boston and Singapore with diverse locations, time, and weather conditions.

Each data sample contains 6-view RGB camera captures, a 5D LiDAR point cloud, and a corresponding text QA pair. Each LiDAR point cloud includes 5 dimensions of data about distance, intensity, X, Y, and Z axes. In this dataset, the questions are generally difficult, and may require multiple hops of reasoning over the RGB and LiDAR data. For example, to answer the sample question in the above figure, the ML model needs to first identify in which direction the “construction vehicle” appears, and then counts the number of “parked trucks” in that direction. In our evaluations, we further cast the question-answering (QA) as an open-ended text generation task. This is more challenging than the evaluation setup in the original nuScenes-QA paper, where an answer set is predefined and the QA task is a classification task over this predefined answer set.

In most RGB images in the nuScenes dataset, as shown in the above figure - Left, the lighting conditions in night scenes are still abundant (e.g., with street lights), and we hence further reduce the brightness of RGB captures in night scenes by 80% and apply Gaussian blur with a radius of 7, as shown in the above figure - Right. By applying such preprocessing to the RGB views in night scenes, we obtain the training and validation splits of night scenes with 659 samples for each split. On the other hand, the RGB views in daytime scenes remain as the origin. The day split contains 2,229 for training and 2,229 for validation respectively.
How to Use
----------
If you find our dataset useful, please consider citing
License
=======
[](URL)
Being aligned with original nuScenes' license, this work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](URL).
[](URL)
| [
"# train samples: 2,229, # validation samples: 2,229\nscene: night, # train samples: 659, # validation samples: 659\n\n\n\nDetailed Description\n--------------------\n\n\nThis dataset is built on the nuScenes mini-split, where we obtain the QA pairs from the original nuScenes-QA dataset. The data in the nuScenes-QA dataset is collected from driving scenes in cities of Boston and Singapore with diverse locations, time, and weather conditions.\n\n\n\nEach data sample contains 6-view RGB camera captures, a 5D LiDAR point cloud, and a corresponding text QA pair. Each LiDAR point cloud includes 5 dimensions of data about distance, intensity, X, Y, and Z axes. In this dataset, the questions are generally difficult, and may require multiple hops of reasoning over the RGB and LiDAR data. For example, to answer the sample question in the above figure, the ML model needs to first identify in which direction the “construction vehicle” appears, and then counts the number of “parked trucks” in that direction. In our evaluations, we further cast the question-answering (QA) as an open-ended text generation task. This is more challenging than the evaluation setup in the original nuScenes-QA paper, where an answer set is predefined and the QA task is a classification task over this predefined answer set.\n\n\n\nIn most RGB images in the nuScenes dataset, as shown in the above figure - Left, the lighting conditions in night scenes are still abundant (e.g., with street lights), and we hence further reduce the brightness of RGB captures in night scenes by 80% and apply Gaussian blur with a radius of 7, as shown in the above figure - Right. By applying such preprocessing to the RGB views in night scenes, we obtain the training and validation splits of night scenes with 659 samples for each split. On the other hand, the RGB views in daytime scenes remain as the origin. The day split contains 2,229 for training and 2,229 for validation respectively.\n\n\nHow to Use\n----------\n\n\nIf you find our dataset useful, please consider citing\n\n\nLicense\n=======\n\n\n[](URL)\n\n\nBeing aligned with original nuScenes' license, this work is licensed under a\n[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](URL).\n\n\n[](URL)"
] | [
"TAGS\n#task_categories-visual-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #arxiv-2312.07886 #arxiv-2305.14836 #region-us \n",
"# train samples: 2,229, # validation samples: 2,229\nscene: night, # train samples: 659, # validation samples: 659\n\n\n\nDetailed Description\n--------------------\n\n\nThis dataset is built on the nuScenes mini-split, where we obtain the QA pairs from the original nuScenes-QA dataset. The data in the nuScenes-QA dataset is collected from driving scenes in cities of Boston and Singapore with diverse locations, time, and weather conditions.\n\n\n\nEach data sample contains 6-view RGB camera captures, a 5D LiDAR point cloud, and a corresponding text QA pair. Each LiDAR point cloud includes 5 dimensions of data about distance, intensity, X, Y, and Z axes. In this dataset, the questions are generally difficult, and may require multiple hops of reasoning over the RGB and LiDAR data. For example, to answer the sample question in the above figure, the ML model needs to first identify in which direction the “construction vehicle” appears, and then counts the number of “parked trucks” in that direction. In our evaluations, we further cast the question-answering (QA) as an open-ended text generation task. This is more challenging than the evaluation setup in the original nuScenes-QA paper, where an answer set is predefined and the QA task is a classification task over this predefined answer set.\n\n\n\nIn most RGB images in the nuScenes dataset, as shown in the above figure - Left, the lighting conditions in night scenes are still abundant (e.g., with street lights), and we hence further reduce the brightness of RGB captures in night scenes by 80% and apply Gaussian blur with a radius of 7, as shown in the above figure - Right. By applying such preprocessing to the RGB views in night scenes, we obtain the training and validation splits of night scenes with 659 samples for each split. On the other hand, the RGB views in daytime scenes remain as the origin. The day split contains 2,229 for training and 2,229 for validation respectively.\n\n\nHow to Use\n----------\n\n\nIf you find our dataset useful, please consider citing\n\n\nLicense\n=======\n\n\n[](URL)\n\n\nBeing aligned with original nuScenes' license, this work is licensed under a\n[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](URL).\n\n\n[](URL)"
] |
19426e6cbb84e95bbf16403e2d1c57f8bde453a6 | # Dataset Card for "cai-conversation-dev1705620799"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/cai-conversation-dev1705620799 | [
"region:us"
] | 2024-01-18T23:34:28+00:00 | {"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "sequence": "string"}, {"name": "chosen", "sequence": "string"}, {"name": "rejected", "sequence": "string"}], "splits": [{"name": "train_sft", "num_bytes": 193672, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 187943, "num_examples": 64}, {"name": "test_sft", "num_bytes": 188701, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 194910, "num_examples": 64}], "download_size": 423499, "dataset_size": 765226}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]} | 2024-01-18T23:34:34+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cai-conversation-dev1705620799"
More Information needed | [
"# Dataset Card for \"cai-conversation-dev1705620799\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cai-conversation-dev1705620799\"\n\nMore Information needed"
] |
56b68010e53cf33c48cb5da619f62800683e9024 | # Dataset Card for "cai-conversation-dev1705620998"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/cai-conversation-dev1705620998 | [
"region:us"
] | 2024-01-18T23:37:54+00:00 | {"dataset_info": {"features": [{"name": "init_prompt", "dtype": "string"}, {"name": "init_response", "dtype": "string"}, {"name": "critic_prompt", "dtype": "string"}, {"name": "critic_response", "dtype": "string"}, {"name": "revision_prompt", "dtype": "string"}, {"name": "revision_response", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "messages", "sequence": "string"}, {"name": "chosen", "sequence": "string"}, {"name": "rejected", "sequence": "string"}], "splits": [{"name": "train_sft", "num_bytes": 237227, "num_examples": 64}, {"name": "train_prefs", "num_bytes": 234165, "num_examples": 64}, {"name": "test_sft", "num_bytes": 263146, "num_examples": 64}, {"name": "test_prefs", "num_bytes": 247201, "num_examples": 64}], "download_size": 544968, "dataset_size": 981739}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}, {"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_sft", "path": "data/test_sft-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}]} | 2024-01-18T23:37:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cai-conversation-dev1705620998"
More Information needed | [
"# Dataset Card for \"cai-conversation-dev1705620998\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cai-conversation-dev1705620998\"\n\nMore Information needed"
] |
8329b8917ca7058dec79f70b507afab242065034 |
# Dataset Card for Evaluation run of AA051611/A0119
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/A0119](https://huggingface.co/AA051611/A0119) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__A0119",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-18T23:36:49.726923](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0119/blob/main/results_2024-01-18T23-36-49.726923.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7265379913533245,
"acc_stderr": 0.029728940123723756,
"acc_norm": 0.7314556841951575,
"acc_norm_stderr": 0.030296550379322322,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5795530367826707,
"mc2_stderr": 0.015322289394733637
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536588,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916573
},
"harness|hellaswag|10": {
"acc": 0.6603266281617207,
"acc_stderr": 0.004726304225137318,
"acc_norm": 0.8474407488548098,
"acc_norm_stderr": 0.0035882728748524817
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930384,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899105,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795717,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795717
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7574468085106383,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.7574468085106383,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7103448275862069,
"acc_stderr": 0.037800192304380156,
"acc_norm": 0.7103448275862069,
"acc_norm_stderr": 0.037800192304380156
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6534391534391535,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.6534391534391535,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.019469334586486937,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.019469334586486937
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.03445487686264716,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.03445487686264716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503582,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503582
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527012,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527012
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.020473233173551986,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.020473233173551986
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497582,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497582
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.02452866497130541,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.02452866497130541
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683776,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137123,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137123
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692698,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692698
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094713,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476072,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476072
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869623,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869623
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253864,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253864
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747094,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9003831417624522,
"acc_stderr": 0.010709685591251671,
"acc_norm": 0.9003831417624522,
"acc_norm_stderr": 0.010709685591251671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6256983240223464,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.6256983240223464,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.0231527224394023,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.0231527224394023
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435122,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.02147349183480835,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.02147349183480835
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594113,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594113
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5482398956975228,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.5482398956975228,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.01677467236546851,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.01677467236546851
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9005847953216374,
"acc_stderr": 0.022949025579355027,
"acc_norm": 0.9005847953216374,
"acc_norm_stderr": 0.022949025579355027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5795530367826707,
"mc2_stderr": 0.015322289394733637
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.5739196360879454,
"acc_stderr": 0.013621144396086709
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051611__A0119 | [
"region:us"
] | 2024-01-18T23:39:03+00:00 | {"pretty_name": "Evaluation run of AA051611/A0119", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/A0119](https://huggingface.co/AA051611/A0119) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__A0119\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T23:36:49.726923](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__A0119/blob/main/results_2024-01-18T23-36-49.726923.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7265379913533245,\n \"acc_stderr\": 0.029728940123723756,\n \"acc_norm\": 0.7314556841951575,\n \"acc_norm_stderr\": 0.030296550379322322,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5795530367826707,\n \"mc2_stderr\": 0.015322289394733637\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6603266281617207,\n \"acc_stderr\": 0.004726304225137318,\n \"acc_norm\": 0.8474407488548098,\n \"acc_norm_stderr\": 0.0035882728748524817\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930384,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899105,\n \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899105\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795717,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795717\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7574468085106383,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.7574468085106383,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.037800192304380156,\n \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.037800192304380156\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6534391534391535,\n \"acc_stderr\": 0.024508777521028424,\n \"acc_norm\": 0.6534391534391535,\n \"acc_norm_stderr\": 0.024508777521028424\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n \"acc_stderr\": 0.019469334586486937,\n \"acc_norm\": 0.864516129032258,\n \"acc_norm_stderr\": 0.019469334586486937\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6009852216748769,\n \"acc_stderr\": 0.03445487686264716,\n \"acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.03445487686264716\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503582,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503582\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527012,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527012\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7948717948717948,\n 
\"acc_stderr\": 0.020473233173551986,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.020473233173551986\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497582,\n \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497582\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.02452866497130541,\n \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.02452866497130541\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683776,\n \"acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.908256880733945,\n \"acc_stderr\": 0.012376323409137123,\n \"acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137123\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692698,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692698\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.028380391147094713,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.028380391147094713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476072,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476072\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869623,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869623\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253864,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253864\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747094,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n 
\"acc_norm_stderr\": 0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6256983240223464,\n \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.6256983240223464,\n \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.0231527224394023,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.0231527224394023\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n \"acc_stderr\": 0.023222756797435122,\n \"acc_norm\": 0.7877813504823151,\n \"acc_norm_stderr\": 0.023222756797435122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.02147349183480835,\n \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.02147349183480835\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594113,\n \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594113\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5482398956975228,\n \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.5482398956975228,\n \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.01677467236546851,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.01677467236546851\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9005847953216374,\n \"acc_stderr\": 0.022949025579355027,\n \"acc_norm\": 0.9005847953216374,\n \"acc_norm_stderr\": 0.022949025579355027\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5795530367826707,\n \"mc2_stderr\": 0.015322289394733637\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5739196360879454,\n \"acc_stderr\": 0.013621144396086709\n }\n}\n```", "repo_url": "https://huggingface.co/AA051611/A0119", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|arc:challenge|25_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|gsm8k|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hellaswag|10_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-36-49.726923.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-36-49.726923.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-36-49.726923.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T23-36-49.726923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-36-49.726923.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T23-36-49.726923.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["**/details_harness|winogrande|5_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T23-36-49.726923.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_18T23_36_49.726923", "path": ["results_2024-01-18T23-36-49.726923.parquet"]}, {"split": "latest", "path": 
["results_2024-01-18T23-36-49.726923.parquet"]}]}]} | 2024-01-18T23:39:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051611/A0119
Dataset automatically created during the evaluation run of model AA051611/A0119 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
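For orientation, a minimal sketch of listing the available configurations and splits with the `datasets` library is shown below. Note that the repository id used here is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention; this card does not state it explicitly, so adjust it if needed.

```python
from datasets import get_dataset_config_names, get_dataset_split_names

# Assumed repository id (not stated in this card), following the
# leaderboard's usual "open-llm-leaderboard/details_<org>__<model>" pattern.
REPO_ID = "open-llm-leaderboard/details_AA051611__A0119"

# List the 63 task configurations available in this dataset.
configs = get_dataset_config_names(REPO_ID)
print(len(configs), configs[:5])

# Each configuration exposes one split per run timestamp plus "latest".
print(get_dataset_split_names(REPO_ID, "harness_winogrande_5"))
```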
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
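The following is a minimal sketch using the `datasets` library. The repository id is assumed from the leaderboard's usual naming convention rather than taken from this card, and the configuration name is one of those listed in the metadata above.

```python
from datasets import load_dataset

# Assumed repository id (not stated in this card); the Open LLM Leaderboard
# typically publishes per-model details under this naming pattern.
REPO_ID = "open-llm-leaderboard/details_AA051611__A0119"

# Load the per-sample details of one task configuration.
# Use the "latest" split, or a timestamped split for a specific run.
data = load_dataset(REPO_ID, "harness_winogrande_5", split="latest")
print(data)
print(data[0])
```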
## Latest results
These are the latest results from run 2024-01-18T23:36:49.726923 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval).
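To retrieve these aggregated numbers programmatically, a minimal sketch is shown below, again assuming the repository id follows the leaderboard's usual naming convention.

```python
from datasets import load_dataset

# Assumed repository id, as above.
REPO_ID = "open-llm-leaderboard/details_AA051611__A0119"

# The "results" configuration holds the aggregated metrics for each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(REPO_ID, "results", split="latest")
print(results[0])  # a single row with the aggregated metrics per task
```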
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051611/A0119\n\n\n\nDataset automatically created during the evaluation run of model AA051611/A0119 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T23:36:49.726923(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051611/A0119\n\n\n\nDataset automatically created during the evaluation run of model AA051611/A0119 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-18T23:36:49.726923(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |